# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-34B-dare
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-34B-dare](https://huggingface.co/louisbrulenaudet/Pearl-34B-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare",
"harness_winogrande_5",
split="train")
```
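To load the aggregated metrics rather than the per-example details, the "results" configuration can be loaded the same way. A minimal sketch, assuming the same `datasets` API as the example above; the "latest" split points to the most recent run:

```python
from datasets import load_dataset

# Load the aggregated results for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare",
    "results",
    split="latest")
```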
## Latest results
These are the [latest results from run 2024-02-13T04:00:24.953384](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare/blob/main/results_2024-02-13T04-00-24.953384.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results and its "latest" split):
```python
{
"all": {
"acc": 0.7587346915989489,
"acc_stderr": 0.028351398155327497,
"acc_norm": 0.763853167003337,
"acc_norm_stderr": 0.028878179543793354,
"mc1": 0.5116279069767442,
"mc1_stderr": 0.017498767175740084,
"mc2": 0.6850136264565471,
"mc2_stderr": 0.014412881216443527
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620192,
"acc_norm": 0.6843003412969283,
"acc_norm_stderr": 0.013582571095815291
},
"harness|hellaswag|10": {
"acc": 0.6395140410276837,
"acc_stderr": 0.0047916019756127646,
"acc_norm": 0.8360884285998805,
"acc_norm_stderr": 0.003694387361177659
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474945,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474945
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440182,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440182
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7379310344827587,
"acc_stderr": 0.03664666337225257,
"acc_norm": 0.7379310344827587,
"acc_norm_stderr": 0.03664666337225257
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7248677248677249,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.7248677248677249,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.01730838128103452,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.01730838128103452
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199505,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.011464523356953162,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.011464523356953162
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8128205128205128,
"acc_stderr": 0.019776601086550032,
"acc_norm": 0.8128205128205128,
"acc_norm_stderr": 0.019776601086550032
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.03044452852881074,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.03044452852881074
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02476290267805793,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02476290267805793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116245,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316942,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316942
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.01849831520686538,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.01849831520686538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.030381596756651655,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.030381596756651655
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9029374201787995,
"acc_stderr": 0.0105864747120183,
"acc_norm": 0.9029374201787995,
"acc_norm_stderr": 0.0105864747120183
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.021152676966575277,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.021152676966575277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8078212290502793,
"acc_stderr": 0.013177759505210093,
"acc_norm": 0.8078212290502793,
"acc_norm_stderr": 0.013177759505210093
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8398692810457516,
"acc_stderr": 0.020998740930362303,
"acc_norm": 0.8398692810457516,
"acc_norm_stderr": 0.020998740930362303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571846,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6347517730496454,
"acc_stderr": 0.02872386385328127,
"acc_norm": 0.6347517730496454,
"acc_norm_stderr": 0.02872386385328127
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5743155149934811,
"acc_stderr": 0.012628393551811942,
"acc_norm": 0.5743155149934811,
"acc_norm_stderr": 0.012628393551811942
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.023345163616544855,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.023345163616544855
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8104575163398693,
"acc_stderr": 0.015856152189980252,
"acc_norm": 0.8104575163398693,
"acc_norm_stderr": 0.015856152189980252
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824657,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824657
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789256,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789256
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5116279069767442,
"mc1_stderr": 0.017498767175740084,
"mc2": 0.6850136264565471,
"mc2_stderr": 0.014412881216443527
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267198
},
"harness|gsm8k|5": {
"acc": 0.6353297952994693,
"acc_stderr": 0.013258428375662245
}
}
```
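If you prefer working with the raw results file linked above rather than the `datasets` API, it can be fetched directly from the repository. A minimal sketch, assuming the `huggingface_hub` client is installed; the file name is taken from the link in this section, and the exact JSON layout (top-level metrics versus a nested "results" key) is an assumption hedged in the code:

```python
import json

from huggingface_hub import hf_hub_download

# Download the per-run results JSON from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare",
    filename="results_2024-02-13T04-00-24.953384.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The metrics shown above sit under the "all" key; depending on the file
# layout they may be at the top level or nested under a "results" key.
metrics = data.get("results", data)
print(metrics["all"])
```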
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare | [
"region:us"
] | 2024-02-13T04:02:39+00:00 | {"pretty_name": "Evaluation run of louisbrulenaudet/Pearl-34B-dare", "dataset_summary": "Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-34B-dare](https://huggingface.co/louisbrulenaudet/Pearl-34B-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T04:00:24.953384](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare/blob/main/results_2024-02-13T04-00-24.953384.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7587346915989489,\n \"acc_stderr\": 0.028351398155327497,\n \"acc_norm\": 0.763853167003337,\n \"acc_norm_stderr\": 0.028878179543793354,\n \"mc1\": 0.5116279069767442,\n \"mc1_stderr\": 0.017498767175740084,\n \"mc2\": 0.6850136264565471,\n \"mc2_stderr\": 0.014412881216443527\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620192,\n \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815291\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6395140410276837,\n \"acc_stderr\": 0.0047916019756127646,\n \"acc_norm\": 0.8360884285998805,\n \"acc_norm_stderr\": 0.003694387361177659\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474945,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474945\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n \"acc_stderr\": 0.024774516250440182,\n \"acc_norm\": 0.9027777777777778,\n \"acc_norm_stderr\": 0.024774516250440182\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387536,\n \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387536\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7379310344827587,\n \"acc_stderr\": 0.03664666337225257,\n \"acc_norm\": 0.7379310344827587,\n \"acc_norm_stderr\": 0.03664666337225257\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7248677248677249,\n \"acc_stderr\": 0.023000086859068642,\n \"acc_norm\": 0.7248677248677249,\n \"acc_norm_stderr\": 0.023000086859068642\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.01730838128103452,\n \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.01730838128103452\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199505,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199505\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.011464523356953162,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.011464523356953162\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8128205128205128,\n \"acc_stderr\": 0.019776601086550032,\n \"acc_norm\": 0.8128205128205128,\n \"acc_norm_stderr\": 0.019776601086550032\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.03044452852881074,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.03044452852881074\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02476290267805793,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02476290267805793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116245,\n \"acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116245\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.032365852526021574,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.032365852526021574\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316942,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316942\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.01849831520686538,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.01849831520686538\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.030381596756651655,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.030381596756651655\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9029374201787995,\n \"acc_stderr\": 0.0105864747120183,\n \"acc_norm\": 0.9029374201787995,\n \"acc_norm_stderr\": 0.0105864747120183\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575277,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575277\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8078212290502793,\n \"acc_stderr\": 0.013177759505210093,\n \"acc_norm\": 0.8078212290502793,\n \"acc_norm_stderr\": 0.013177759505210093\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8398692810457516,\n \"acc_stderr\": 0.020998740930362303,\n \"acc_norm\": 0.8398692810457516,\n \"acc_norm_stderr\": 0.020998740930362303\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571846,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571846\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6347517730496454,\n \"acc_stderr\": 0.02872386385328127,\n \"acc_norm\": 0.6347517730496454,\n \"acc_norm_stderr\": 0.02872386385328127\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5743155149934811,\n \"acc_stderr\": 0.012628393551811942,\n \"acc_norm\": 0.5743155149934811,\n \"acc_norm_stderr\": 0.012628393551811942\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.023345163616544855,\n \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.023345163616544855\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8104575163398693,\n \"acc_stderr\": 0.015856152189980252,\n \"acc_norm\": 0.8104575163398693,\n \"acc_norm_stderr\": 0.015856152189980252\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824657,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.5903614457831325,\n \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789256,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789256\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5116279069767442,\n \"mc1_stderr\": 0.017498767175740084,\n \"mc2\": 0.6850136264565471,\n \"mc2_stderr\": 0.014412881216443527\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267198\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6353297952994693,\n \"acc_stderr\": 0.013258428375662245\n 
}\n}\n```", "repo_url": "https://huggingface.co/louisbrulenaudet/Pearl-34B-dare", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|arc:challenge|25_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|gsm8k|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hellaswag|10_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-00-24.953384.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-00-24.953384.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-00-24.953384.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T04-00-24.953384.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-00-24.953384.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T04_00_24.953384", "path": ["**/details_harness|winogrande|5_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T04-00-24.953384.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_13T04_00_24.953384", "path": ["results_2024-02-13T04-00-24.953384.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T04-00-24.953384.parquet"]}]}]} | 2024-02-13T04:03:01+00:00 | [] | [] | TAGS
# Dataset Card for Evaluation run of Gille/StrangeMerges_23-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_23-7B-slerp](https://huggingface.co/Gille/StrangeMerges_23-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-sample details for one task configuration of this run;
# the "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp",
	"harness_winogrande_5",
	split="train")
```
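The snippet above pulls the per-sample details for a single task. If you only need the aggregated metrics, the "results" configuration can be loaded the same way. A minimal sketch (the config and split names below come from this card's metadata; the exact column layout of each split is not documented here):

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp"

# Aggregated metrics for the whole run; the "latest" split always points
# to the most recent evaluation (here 2024-02-13T04:04:45.787844).
results = load_dataset(repo, "results", split="latest")

# Per-sample details for one task, e.g. the 5-shot GSM8K config.
gsm8k = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(len(gsm8k))  # number of evaluated examples
```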
## Latest results
These are the [latest results from run 2024-02-13T04:04:45.787844](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp/blob/main/results_2024-02-13T04-04-45.787844.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.654991806635409,
"acc_stderr": 0.03201532056571857,
"acc_norm": 0.654281003985643,
"acc_norm_stderr": 0.03268521672767465,
"mc1": 0.605875152998776,
"mc1_stderr": 0.017106588140700325,
"mc2": 0.7513000721365315,
"mc2_stderr": 0.014241360525807377
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266127,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.7181836287592113,
"acc_stderr": 0.004489648865080876,
"acc_norm": 0.8889663413662617,
"acc_norm_stderr": 0.0031353173122281234
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652456,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4569832402234637,
"acc_stderr": 0.01666049858050917,
"acc_norm": 0.4569832402234637,
"acc_norm_stderr": 0.01666049858050917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365547,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.605875152998776,
"mc1_stderr": 0.017106588140700325,
"mc2": 0.7513000721365315,
"mc2_stderr": 0.014241360525807377
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598484
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624174
}
}
```
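Since the block above is plain JSON, it can be post-processed directly once saved to disk, for instance to average the normalized accuracy over the "hendrycksTest" (MMLU) subtasks. A small sketch, assuming the JSON has been saved locally as `results.json`:

```python
import json

# Assumption: the results JSON above has been saved as results.json.
with open("results.json") as f:
    results = json.load(f)

# Average normalized accuracy over the MMLU ("hendrycksTest") subtasks.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
avg = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {avg:.4f}")
```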
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp | [
"region:us"
] | 2024-02-13T04:07:04+00:00 | {"pretty_name": "Evaluation run of Gille/StrangeMerges_23-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_23-7B-slerp](https://huggingface.co/Gille/StrangeMerges_23-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T04:04:45.787844](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp/blob/main/results_2024-02-13T04-04-45.787844.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.654991806635409,\n \"acc_stderr\": 0.03201532056571857,\n \"acc_norm\": 0.654281003985643,\n \"acc_norm_stderr\": 0.03268521672767465,\n \"mc1\": 0.605875152998776,\n \"mc1_stderr\": 0.017106588140700325,\n \"mc2\": 0.7513000721365315,\n \"mc2_stderr\": 0.014241360525807377\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266127,\n \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7181836287592113,\n \"acc_stderr\": 0.004489648865080876,\n \"acc_norm\": 0.8889663413662617,\n \"acc_norm_stderr\": 0.0031353173122281234\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652456,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652456\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n 
\"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365547,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365547\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.605875152998776,\n \"mc1_stderr\": 0.017106588140700325,\n \"mc2\": 0.7513000721365315,\n \"mc2_stderr\": 0.014241360525807377\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598484\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \"acc_stderr\": 0.012588685966624174\n }\n}\n```", "repo_url": "https://huggingface.co/Gille/StrangeMerges_23-7B-slerp", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|arc:challenge|25_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|gsm8k|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hellaswag|10_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-04-45.787844.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-04-45.787844.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-04-45.787844.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T04-04-45.787844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-04-45.787844.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-04-45.787844.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["**/details_harness|winogrande|5_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T04-04-45.787844.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T04_04_45.787844", "path": ["results_2024-02-13T04-04-45.787844.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T04-04-45.787844.parquet"]}]}]} | 2024-02-13T04:07:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Gille/StrangeMerges_23-7B-slerp
Dataset automatically created during the evaluation run of model Gille/StrangeMerges_23-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
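A minimal sketch (the repository id below is an assumption, inferred from the `open-llm-leaderboard/details_<org>__<model>` naming convention used by the other cards in this dump):

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard naming convention --
# verify it exists on the Hub before relying on it.
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_23-7B-slerp",
                    "harness_winogrande_5",
                    split="train")
```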
## Latest results
These are the latest results from run 2024-02-13T04:04:45.787844 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Gille/StrangeMerges_23-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_23-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T04:04:45.787844(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Gille/StrangeMerges_23-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_23-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T04:04:45.787844(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Gille/StrangeMerges_23-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_23-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T04:04:45.787844(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
a77b3873827d8d141d69602c524072b844e0ad4e | # Dataset Card for LAION DALL·E 3 Discord Dataset
**Description**: This dataset consists of caption and image pairs scraped from the LAION [share-dalle-3 discord channel](https://discord.com/channels/823813159592001537/1158354590463447092). The purpose is to collect image-text pairs for research and exploration.
**Source Code**: The code used to generate this data can be found [here](https://github.com/LAION-AI/Discord-Scrapers.git).
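For a quick look at the pairs, the dataset can be streamed with the `datasets` library (a minimal sketch; the repository id `OpenDatasets/dalle-3-dataset` is taken from this card's own metadata, and streaming avoids downloading the roughly 25 GB of embedded images up front):

```python
from datasets import load_dataset

# Stream the single "train" split so only the inspected rows are fetched.
ds = load_dataset("OpenDatasets/dalle-3-dataset", split="train", streaming=True)

# Print the caption and source link of the first few caption/image pairs.
for example in ds.take(3):
    print(example["caption"][:80], "->", example["link"])
```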
## Contributors
- [Zach Nagengast](https://github.com/ZachNagengast)
- [Eduardo Pach](https://github.com/EduardoPach)
- [Seva Maltsev](https://github.com/TwoAbove)
- The [LAION community](https://discord.com/invite/eq3cAMZtCC)
## Data Attributes
- **caption**: The text description or prompt associated with the image. Data type: string.
- **image**: The embedded image data from the discord message attachment. Data type: image.
- **link**: The URL to the associated image. Data type: string.
- **message_id**: The discord message id where the image was posted. Data type: string.
- **timestamp**: Time the original message was posted. Data type: string. | OpenDatasets/dalle-3-dataset | [
"language:en",
"license:cc0-1.0",
"image-text-dataset",
"synthetic-dataset",
"region:us"
] | 2024-02-13T04:17:46+00:00 | {"language": ["en"], "license": ["cc0-1.0"], "tags": ["image-text-dataset", "synthetic-dataset"], "dataset_info": {"features": [{"name": "caption", "dtype": "string"}, {"name": "image", "dtype": "image"}, {"name": "link", "dtype": "string"}, {"name": "message_id", "dtype": "string"}, {"name": "timestamp", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25851562139.271, "num_examples": 14927}], "download_size": 25829593712, "dataset_size": 25851562139.271}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T07:15:56+00:00 | [] | [
"en"
] | TAGS
#language-English #license-cc0-1.0 #image-text-dataset #synthetic-dataset #region-us
| # Dataset Card for LAION DALL·E 3 Discord Dataset
Description: This dataset consists of caption and image pairs scraped from the LAION share-dalle-3 discord channel. The purpose is to collect image-text pairs for research and exploration.
Source Code: The code used to generate this data can be found here.
## Contributors
- Zach Nagengast
- Eduardo Pach
- Seva Maltsev
- The LAION community
## Data Attributes
- caption: The text description or prompt associated with the image. Data type: string.
- image: The embedded image data from the discord message attachment. Data type: image.
- link: The URL to the associated image. Data type: string.
- message_id: The discord message id where the image was posted. Data type: string.
- timestamp: Time the original message was posted. Datatype: string. | [
"# Dataset Card for LAION DALL·E 3 Discord Dataset\n\nDescription: This dataset consists of caption and image pairs scraped from the LAION share-dalle-3 discord channel. The purpose is to collect image-text pairs for research and exploration.\n\nSource Code: The code used to generate this data can be found here.",
"## Contributors\n\n- Zach Nagengast\n- Eduardo Pach\n- Seva Maltsev\n- The LAION community",
"## Data Attributes\n\n- caption: The text description or prompt associated with the image. Data type: string.\n- image: The embedded image data from the discord message attachment. Data type: image.\n- link: The URL to the associated image. Data type: string.\n- message_id: The discord message id where the image was posted. Data type: string.\n- timestamp: Time the original message was posted. Datatype: string."
] | [
"TAGS\n#language-English #license-cc0-1.0 #image-text-dataset #synthetic-dataset #region-us \n",
"# Dataset Card for LAION DALL·E 3 Discord Dataset\n\nDescription: This dataset consists of caption and image pairs scraped from the LAION share-dalle-3 discord channel. The purpose is to collect image-text pairs for research and exploration.\n\nSource Code: The code used to generate this data can be found here.",
"## Contributors\n\n- Zach Nagengast\n- Eduardo Pach\n- Seva Maltsev\n- The LAION community",
"## Data Attributes\n\n- caption: The text description or prompt associated with the image. Data type: string.\n- image: The embedded image data from the discord message attachment. Data type: image.\n- link: The URL to the associated image. Data type: string.\n- message_id: The discord message id where the image was posted. Data type: string.\n- timestamp: Time the original message was posted. Datatype: string."
] | [
32,
74,
23,
96
] | [
"passage: TAGS\n#language-English #license-cc0-1.0 #image-text-dataset #synthetic-dataset #region-us \n# Dataset Card for LAION DALL·E 3 Discord Dataset\n\nDescription: This dataset consists of caption and image pairs scraped from the LAION share-dalle-3 discord channel. The purpose is to collect image-text pairs for research and exploration.\n\nSource Code: The code used to generate this data can be found here.## Contributors\n\n- Zach Nagengast\n- Eduardo Pach\n- Seva Maltsev\n- The LAION community## Data Attributes\n\n- caption: The text description or prompt associated with the image. Data type: string.\n- image: The embedded image data from the discord message attachment. Data type: image.\n- link: The URL to the associated image. Data type: string.\n- message_id: The discord message id where the image was posted. Data type: string.\n- timestamp: Time the original message was posted. Datatype: string."
] |
e667c43221bf817824960e9bc8dde372e7b17ca1 |
# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-Bacchus
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/MiniCPM-3B-Bacchus](https://huggingface.co/indischepartij/MiniCPM-3B-Bacchus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Bacchus",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T04:24:43.886690](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Bacchus/blob/main/results_2024-02-13T04-24-43.886690.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5043816466809818,
"acc_stderr": 0.034512563967863674,
"acc_norm": 0.50707380697781,
"acc_norm_stderr": 0.03522451893789315,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.43515598064870864,
"mc2_stderr": 0.014518057161705074
},
"harness|arc:challenge|25": {
"acc": 0.40102389078498296,
"acc_stderr": 0.014322255790719869,
"acc_norm": 0.4351535836177474,
"acc_norm_stderr": 0.014487986197186045
},
"harness|hellaswag|10": {
"acc": 0.5152360087631945,
"acc_stderr": 0.004987464257999318,
"acc_norm": 0.7045409281019717,
"acc_norm_stderr": 0.004553164013379556
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5169811320754717,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.5169811320754717,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.031907012423268113,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.031907012423268113
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899207,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5935483870967742,
"acc_stderr": 0.02794172734625631,
"acc_norm": 0.5935483870967742,
"acc_norm_stderr": 0.02794172734625631
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.038881769216741004,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.038881769216741004
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6414141414141414,
"acc_stderr": 0.03416903640391522,
"acc_norm": 0.6414141414141414,
"acc_norm_stderr": 0.03416903640391522
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945287,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945287
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6201834862385321,
"acc_stderr": 0.020808825617866244,
"acc_norm": 0.6201834862385321,
"acc_norm_stderr": 0.020808825617866244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.03236585252602157,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.03236585252602157
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239172,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239172
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.03145068600744859,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.03145068600744859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.043389203057924,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.043389203057924
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199985,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199985
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6334610472541508,
"acc_stderr": 0.01723124462679704,
"acc_norm": 0.6334610472541508,
"acc_norm_stderr": 0.01723124462679704
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098177,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.014776765066438881,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.014776765066438881
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.028358956313423545,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.028358956313423545
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5691318327974276,
"acc_stderr": 0.028125340983972708,
"acc_norm": 0.5691318327974276,
"acc_norm_stderr": 0.028125340983972708
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.027667138569422715,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.027667138569422715
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590954,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590954
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.379400260756193,
"acc_stderr": 0.012393202029825398,
"acc_norm": 0.379400260756193,
"acc_norm_stderr": 0.012393202029825398
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.02967428828131118,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.02967428828131118
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47875816993464054,
"acc_stderr": 0.020209572388600265,
"acc_norm": 0.47875816993464054,
"acc_norm_stderr": 0.020209572388600265
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5959183673469388,
"acc_stderr": 0.03141470802586589,
"acc_norm": 0.5959183673469388,
"acc_norm_stderr": 0.03141470802586589
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296035,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296035
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.43515598064870864,
"mc2_stderr": 0.014518057161705074
},
"harness|winogrande|5": {
"acc": 0.6685082872928176,
"acc_stderr": 0.013230397198964657
},
"harness|gsm8k|5": {
"acc": 0.4048521607278241,
"acc_stderr": 0.0135208176668705
}
}
```
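The per-task entries above follow a uniform `harness|<task>|<n_shot>` key scheme, so aggregates can be recomputed directly from the linked JSON (a minimal sketch; the exact on-disk layout of the downloaded results file is an assumption and it may nest this dict under an extra key):

```python
import json

# Path taken from the results link above; adjust if the downloaded file
# wraps the dict shown here under an additional key.
with open("results_2024-02-13T04-24-43.886690.json") as f:
    results = json.load(f)

# Macro-average normalized accuracy over the 57 MMLU (hendrycksTest) subtasks.
mmlu = [v["acc_norm"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"MMLU macro-average acc_norm: {sum(mmlu) / len(mmlu):.4f}")
```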
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Bacchus | [
"region:us"
] | 2024-02-13T04:26:27+00:00 | {"pretty_name": "Evaluation run of indischepartij/MiniCPM-3B-Bacchus", "dataset_summary": "Dataset automatically created during the evaluation run of model [indischepartij/MiniCPM-3B-Bacchus](https://huggingface.co/indischepartij/MiniCPM-3B-Bacchus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Bacchus\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T04:24:43.886690](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Bacchus/blob/main/results_2024-02-13T04-24-43.886690.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5043816466809818,\n \"acc_stderr\": 0.034512563967863674,\n \"acc_norm\": 0.50707380697781,\n \"acc_norm_stderr\": 0.03522451893789315,\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.43515598064870864,\n \"mc2_stderr\": 0.014518057161705074\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.40102389078498296,\n \"acc_stderr\": 0.014322255790719869,\n \"acc_norm\": 0.4351535836177474,\n \"acc_norm_stderr\": 0.014487986197186045\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5152360087631945,\n \"acc_stderr\": 0.004987464257999318,\n \"acc_norm\": 0.7045409281019717,\n \"acc_norm_stderr\": 0.004553164013379556\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.031907012423268113,\n \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.031907012423268113\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899207,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899207\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5935483870967742,\n \"acc_stderr\": 0.02794172734625631,\n \"acc_norm\": 0.5935483870967742,\n \"acc_norm_stderr\": 0.02794172734625631\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.038881769216741004,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.038881769216741004\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6414141414141414,\n \"acc_stderr\": 0.03416903640391522,\n \"acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.03416903640391522\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945287,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945287\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6201834862385321,\n \"acc_stderr\": 0.020808825617866244,\n \"acc_norm\": 0.6201834862385321,\n \"acc_norm_stderr\": 0.020808825617866244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3425925925925926,\n \"acc_stderr\": 0.03236585252602157,\n \"acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.03236585252602157\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239172,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239172\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6286919831223629,\n \"acc_stderr\": 0.03145068600744859,\n \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.03145068600744859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.043389203057924,\n \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.043389203057924\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04750077341199985,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04750077341199985\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6334610472541508,\n \"acc_stderr\": 0.01723124462679704,\n \"acc_norm\": 0.6334610472541508,\n \"acc_norm_stderr\": 0.01723124462679704\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098177,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098177\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n \"acc_stderr\": 0.014776765066438881,\n \"acc_norm\": 0.2659217877094972,\n \"acc_norm_stderr\": 0.014776765066438881\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.028358956313423545,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.028358956313423545\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n \"acc_stderr\": 0.028125340983972708,\n \"acc_norm\": 0.5691318327974276,\n \"acc_norm_stderr\": 0.028125340983972708\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.027667138569422715,\n \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.027667138569422715\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590954,\n \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590954\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.379400260756193,\n \"acc_stderr\": 0.012393202029825398,\n \"acc_norm\": 0.379400260756193,\n \"acc_norm_stderr\": 0.012393202029825398\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.02967428828131118,\n \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.02967428828131118\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.47875816993464054,\n \"acc_stderr\": 0.020209572388600265,\n \"acc_norm\": 0.47875816993464054,\n \"acc_norm_stderr\": 0.020209572388600265\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.03141470802586589,\n \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.03141470802586589\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.030769444967296035,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.030769444967296035\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.43515598064870864,\n \"mc2_stderr\": 0.014518057161705074\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.013230397198964657\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4048521607278241,\n \"acc_stderr\": 
0.0135208176668705\n }\n}\n```", "repo_url": "https://huggingface.co/indischepartij/MiniCPM-3B-Bacchus", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|arc:challenge|25_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|gsm8k|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hellaswag|10_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-24-43.886690.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-24-43.886690.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-24-43.886690.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T04-24-43.886690.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-24-43.886690.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T04_24_43.886690", "path": ["**/details_harness|winogrande|5_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T04-24-43.886690.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_13T04_24_43.886690", "path": ["results_2024-02-13T04-24-43.886690.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T04-24-43.886690.parquet"]}]}]} | 2024-02-13T04:26:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-Bacchus
Dataset automatically created during the evaluation run of model indischepartij/MiniCPM-3B-Bacchus on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
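(A minimal sketch: the details repository name below is assumed from this card's `repo_url` and the leaderboard's usual `details_<org>__<model>` naming scheme, and `harness_winogrande_5` is one of the task configurations listed in this card's metadata.)

```python
from datasets import load_dataset

# The "train" split always points to the latest results for this run.
data = load_dataset("open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Bacchus",
                    "harness_winogrande_5",
                    split="train")
```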
## Latest results
These are the latest results from run 2024-02-13T04:24:43.886690 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-Bacchus\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/MiniCPM-3B-Bacchus on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T04:24:43.886690(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-Bacchus\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/MiniCPM-3B-Bacchus on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T04:24:43.886690(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-Bacchus\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/MiniCPM-3B-Bacchus on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T04:24:43.886690(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
e9849a5e1dd08be1429cf99d4100596f32930859 | # Dataset Card for "marriage_classification_1738_1951"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Gbssreejith/marriage_classification_1738_1951 | [
"region:us"
] | 2024-02-13T05:05:15+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "blank", "1": "type1", "2": "type2"}}}}, {"name": "ground_truth", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 139325572.0, "num_examples": 425}, {"name": "test", "num_bytes": 12347904.0, "num_examples": 35}], "download_size": 145568458, "dataset_size": 151673476.0}} | 2024-02-13T05:05:52+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "marriage_classification_1738_1951"
More Information needed | [
"# Dataset Card for \"marriage_classification_1738_1951\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"marriage_classification_1738_1951\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"marriage_classification_1738_1951\"\n\nMore Information needed"
] |
dd2516a2dc3397dcc91195e3077b187a4bf31eb2 |
### How to use:
```bash
pip install datasets
```
```python
from datasets import load_dataset

# Load the full training set, then carve out a 20% validation split.
dataset = load_dataset("mabughali/miia-pothole-train", split="train")
splits = dataset.train_test_split(test_size=0.2)
train_ds = splits['train']
val_ds = splits['test']
```
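As a quick sanity check you can inspect a single record; the `image` and `label` feature names in this sketch are taken from the card's `dataset_info` metadata and are otherwise an assumption:

```python
# Peek at one example: "image" decodes to a PIL image and "label" is a
# binary class (0 or 1) per the dataset_info in this card's metadata.
sample = train_ds[0]
print(sample["label"], sample["image"].size)
```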
| mabughali/miia-pothole-train | [
"license:apache-2.0",
"region:us"
] | 2024-02-13T05:26:24+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "0", "1": "1"}}}}], "splits": [{"name": "train", "num_bytes": 214840332.522, "num_examples": 4026}], "download_size": 219415699, "dataset_size": 214840332.522}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-13T15:57:04+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
### How to use:
| [
"### How to use:"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"### How to use:"
] | [
14,
6
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n### How to use:"
] |
596853dc88ddd6e75a2075ae3f2afee31374a6d3 |
# Dataset Card for Evaluation run of InnerI/InnerILLM-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [InnerI/InnerILLM-7B-slerp](https://huggingface.co/InnerI/InnerILLM-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_InnerI__InnerILLM-7B-slerp",
"harness_winogrande_5",
split="train")
```
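To pull the aggregated scores instead of a single task's details, you can load the "results" configuration mentioned above; this is a sketch assuming the `latest` split name used by these leaderboard detail datasets:

```python
from datasets import load_dataset

# Aggregated metrics across all evaluated tasks, latest run only.
results = load_dataset("open-llm-leaderboard/details_InnerI__InnerILLM-7B-slerp",
                       "results",
                       split="latest")
```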
## Latest results
These are the [latest results from run 2024-02-13T05:45:44.319472](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__InnerILLM-7B-slerp/blob/main/results_2024-02-13T05-45-44.319472.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6453218800457399,
"acc_stderr": 0.03212887690836472,
"acc_norm": 0.6457679471487517,
"acc_norm_stderr": 0.032784859928949854,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598389086821388,
"mc2_stderr": 0.015156739153282793
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.013990571137918762,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518827
},
"harness|hellaswag|10": {
"acc": 0.6697868950408286,
"acc_stderr": 0.004693285694663837,
"acc_norm": 0.8618801035650269,
"acc_norm_stderr": 0.003443206472757467
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055263,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055263
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36201117318435755,
"acc_stderr": 0.016073067350153087,
"acc_norm": 0.36201117318435755,
"acc_norm_stderr": 0.016073067350153087
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015057,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015057
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174934,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174934
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598389086821388,
"mc2_stderr": 0.015156739153282793
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515303
},
"harness|gsm8k|5": {
"acc": 0.6868840030326004,
"acc_stderr": 0.012774285669385084
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
["results_2024-02-13T05-45-44.319472.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T05-45-44.319472.parquet"]}]}]} | 2024-02-13T05:48:06+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of InnerI/InnerILLM-7B-slerp
Dataset automatically created during the evaluation run of model InnerI/InnerILLM-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
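(The rendered summary above dropped the original snippet; a minimal sketch follows, in which the repo id is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention.)
```python
from datasets import load_dataset

# Repo id inferred from the "details_<org>__<model>" convention used by
# the Open LLM Leaderboard detail datasets; an assumption, not confirmed
# by this summary.
data = load_dataset("open-llm-leaderboard/details_InnerI__InnerILLM-7B-slerp",
                    "harness_winogrande_5",
                    split="train")
```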
## Latest results
These are the latest results from run 2024-02-13T05:45:44.319472 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
1c3731ba5b2425824ac1a0b941811d5c0baa4fcf |
# Dataset Card for Evaluation run of uukuguy/speechless-thoughts-mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-thoughts-mistral-7b](https://huggingface.co/uukuguy/speechless-thoughts-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b",
"harness_winogrande_5",
split="train")
```
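The same call works for the aggregated scores. A minimal sketch, assuming the `results` configuration exposes a `latest` split as the configuration list in this card's metadata indicates:
```python
from datasets import load_dataset

# "results" holds one row per run with every aggregated metric;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b",
    "results",
    split="latest",
)
print(results[0])
```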
## Latest results
These are the [latest results from run 2024-02-13T05:55:56.018611](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b/blob/main/results_2024-02-13T05-55-56.018611.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5986195523064864,
"acc_stderr": 0.03298640421694122,
"acc_norm": 0.6043515940265466,
"acc_norm_stderr": 0.03367310097017117,
"mc1": 0.3488372093023256,
"mc1_stderr": 0.016684419859986886,
"mc2": 0.4991408266122745,
"mc2_stderr": 0.015158832970770511
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.014523987638344074,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642662
},
"harness|hellaswag|10": {
"acc": 0.615116510655248,
"acc_stderr": 0.004855733568540264,
"acc_norm": 0.8071101374228241,
"acc_norm_stderr": 0.003937609275348463
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849723,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849723
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.037786210790920566,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.037786210790920566
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.0259885007924119,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.0259885007924119
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397447,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868578,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868578
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803057,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803057
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719097,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719097
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.02557412378654667,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.02557412378654667
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369923,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369923
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889136,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889136
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.029896163033125474,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.029896163033125474
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281508,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3488372093023256,
"mc1_stderr": 0.016684419859986886,
"mc2": 0.4991408266122745,
"mc2_stderr": 0.015158832970770511
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
},
"harness|gsm8k|5": {
"acc": 0.3078089461713419,
"acc_stderr": 0.012714401009923652
}
}
```
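To work with the scores above programmatically rather than read them off, a small sketch like the following ranks the per-task normalized accuracies (it assumes the JSON block above has been saved locally as `results.json`; the field names match the structure shown):
```python
import json

# Assumes the JSON shown above was saved to "results.json".
with open("results.json") as f:
    scores = json.load(f)

# Keep the per-task entries that report a normalized accuracy, skipping the
# "all" aggregate and the entries that report other metrics (winogrande and
# gsm8k report only "acc"; truthfulqa reports "mc1"/"mc2").
per_task = {
    task: metrics["acc_norm"]
    for task, metrics in scores.items()
    if task != "all" and "acc_norm" in metrics
}

# Print tasks from strongest to weakest acc_norm.
for task, acc_norm in sorted(per_task.items(), key=lambda kv: -kv[1]):
    print(f"{task}: {acc_norm:.4f}")
```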
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b | [
"region:us"
] | 2024-02-13T05:58:17+00:00 | {"pretty_name": "Evaluation run of uukuguy/speechless-thoughts-mistral-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-thoughts-mistral-7b](https://huggingface.co/uukuguy/speechless-thoughts-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T05:55:56.018611](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b/blob/main/results_2024-02-13T05-55-56.018611.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5986195523064864,\n \"acc_stderr\": 0.03298640421694122,\n \"acc_norm\": 0.6043515940265466,\n \"acc_norm_stderr\": 0.03367310097017117,\n \"mc1\": 0.3488372093023256,\n \"mc1_stderr\": 0.016684419859986886,\n \"mc2\": 0.4991408266122745,\n \"mc2_stderr\": 0.015158832970770511\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.014523987638344074,\n \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642662\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.615116510655248,\n \"acc_stderr\": 0.004855733568540264,\n \"acc_norm\": 0.8071101374228241,\n \"acc_norm_stderr\": 0.003937609275348463\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849723,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849723\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.037786210790920566,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.037786210790920566\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n \"acc_stderr\": 0.0259885007924119,\n \"acc_norm\": 0.7032258064516129,\n \"acc_norm_stderr\": 0.0259885007924119\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397447,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 
0.026499057701397447\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868578,\n \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868578\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803057,\n \"acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803057\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719097,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719097\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7854406130268199,\n \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.02557412378654667,\n \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.02557412378654667\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n \"acc_stderr\": 0.014551553659369923,\n \"acc_norm\": 0.2536312849162011,\n \"acc_norm_stderr\": 0.014551553659369923\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n \"acc_stderr\": 0.012712265105889136,\n \"acc_norm\": 0.45241199478487615,\n \"acc_norm_stderr\": 0.012712265105889136\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125474,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125474\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281508,\n \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281508\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3488372093023256,\n \"mc1_stderr\": 0.016684419859986886,\n \"mc2\": 0.4991408266122745,\n \"mc2_stderr\": 0.015158832970770511\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3078089461713419,\n \"acc_stderr\": 0.012714401009923652\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/speechless-thoughts-mistral-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|arc:challenge|25_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|gsm8k|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hellaswag|10_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T05-55-56.018611.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T05-55-56.018611.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T05-55-56.018611.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T05-55-56.018611.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T05-55-56.018611.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T05_55_56.018611", "path": ["**/details_harness|winogrande|5_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T05-55-56.018611.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_13T05_55_56.018611", "path": ["results_2024-02-13T05-55-56.018611.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T05-55-56.018611.parquet"]}]}]} | 2024-02-13T05:58:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of uukuguy/speechless-thoughts-mistral-7b
Dataset automatically created during the evaluation run of model uukuguy/speechless-thoughts-mistral-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
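As a minimal sketch (the repo id below is an assumption, inferred from the leaderboard's usual `details_<org>__<model>` naming):

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b",
    "harness_winogrande_5",
    split="train",
)
```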
## Latest results
These are the latest results from run 2024-02-13T05:55:56.018611 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of uukuguy/speechless-thoughts-mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-thoughts-mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T05:55:56.018611(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of uukuguy/speechless-thoughts-mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-thoughts-mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T05:55:56.018611(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/speechless-thoughts-mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-thoughts-mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T05:55:56.018611(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
0a4efdc086aecbc32107ea620cbfcaf8864b656c | [Source](https://theinfosphere.org/Episode_Transcript_Listing)
```
3.9% Cleaned.
```
**You should avoid using this set unless you *clean it yourself*, or wait until I've finished cleaning it and changed this text to say it's good to go.**
Intended for use with Axolotl with this [custom version of ShareGPT](https://github.com/xzuyn/axolotl/blob/dan_metharme/src/axolotl/prompt_strategies/dan_metharme_chat.py).
If an item has an `'episode_number'`, it is cleaned or mostly cleaned.
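For example, a minimal sketch that keeps only those items (assuming a `train` split, and that `episode_number` is `None` on uncleaned items):

```python
from datasets import load_dataset

# Keep only the cleaned (or mostly cleaned) episodes, i.e. items
# that carry an 'episode_number' value.
ds = load_dataset("PJMixers/Futurama-CustomShareGPT", split="train")
cleaned = ds.filter(lambda ex: ex.get("episode_number") is not None)
```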
Average length of each episode seems to be ~10K mistral tokens. | PJMixers/Futurama-CustomShareGPT | [
"size_categories:n<1K",
"language:en",
"region:us"
] | 2024-02-13T06:58:18+00:00 | {"language": ["en"], "size_categories": ["n<1K"]} | 2024-02-15T19:07:50+00:00 | [] | [
"en"
] | TAGS
#size_categories-n<1K #language-English #region-us
| Source
You should avoid using this set unless you *clean it yourself*, or wait until I've finished cleaning it and changed this text to say it's good to go.
Intended for use with Axolotl with this custom version of ShareGPT.
If an item has an ''episode_number'', it is cleaned or mostly cleaned.
Average length of each episode seems to be ~10K mistral tokens. | [] | [
"TAGS\n#size_categories-n<1K #language-English #region-us \n"
] | [
20
] | [
"passage: TAGS\n#size_categories-n<1K #language-English #region-us \n"
] |
42ce9d531bed9152f40a580458c5c2923179f7df |
# Dataset Card for Evaluation run of saishf/Fimbulvetr-Kuro-Lotus-10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saishf/Fimbulvetr-Kuro-Lotus-10.7B](https://huggingface.co/saishf/Fimbulvetr-Kuro-Lotus-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saishf__Fimbulvetr-Kuro-Lotus-10.7B",
"harness_winogrande_5",
split="train")
```
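The aggregated scores live in the "results" configuration mentioned above; a minimal sketch (assuming it exposes a "latest" split like the per-task configurations):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_saishf__Fimbulvetr-Kuro-Lotus-10.7B",
    "results",
    split="latest",
)
```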
## Latest results
These are the [latest results from run 2024-02-13T07:02:24.840541](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Fimbulvetr-Kuro-Lotus-10.7B/blob/main/results_2024-02-13T07-02-24.840541.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6729740857071812,
"acc_stderr": 0.03133709945707105,
"acc_norm": 0.6739144680121636,
"acc_norm_stderr": 0.03197661295643159,
"mc1": 0.4602203182374541,
"mc1_stderr": 0.017448017223960884,
"mc2": 0.6095070151643451,
"mc2_stderr": 0.015534848379967322
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205765,
"acc_norm": 0.6953924914675768,
"acc_norm_stderr": 0.013449522109932485
},
"harness|hellaswag|10": {
"acc": 0.6906990639314877,
"acc_stderr": 0.004612608206670405,
"acc_norm": 0.8787094204341764,
"acc_norm_stderr": 0.0032579745937899446
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810535,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810535
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.0256469283610494,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.0256469283610494
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603915,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970565,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970565
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188703,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188703
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.023784297520918856,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.023784297520918856
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.869198312236287,
"acc_stderr": 0.02194876605947076,
"acc_norm": 0.869198312236287,
"acc_norm_stderr": 0.02194876605947076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798824,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.0215864940012814,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.0215864940012814
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973147,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973147
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48156424581005586,
"acc_stderr": 0.01671113049778282,
"acc_norm": 0.48156424581005586,
"acc_norm_stderr": 0.01671113049778282
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02313237623454334,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02313237623454334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.524822695035461,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.524822695035461,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4973924380704042,
"acc_stderr": 0.012770062445433172,
"acc_norm": 0.4973924380704042,
"acc_norm_stderr": 0.012770062445433172
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7075163398692811,
"acc_stderr": 0.018403415710109797,
"acc_norm": 0.7075163398692811,
"acc_norm_stderr": 0.018403415710109797
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827054,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4602203182374541,
"mc1_stderr": 0.017448017223960884,
"mc2": 0.6095070151643451,
"mc2_stderr": 0.015534848379967322
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.010267936243028224
},
"harness|gsm8k|5": {
"acc": 0.66868840030326,
"acc_stderr": 0.012964999679688664
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_saishf__Fimbulvetr-Kuro-Lotus-10.7B | [
"region:us"
] | 2024-02-13T07:04:44+00:00 | {"pretty_name": "Evaluation run of saishf/Fimbulvetr-Kuro-Lotus-10.7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [saishf/Fimbulvetr-Kuro-Lotus-10.7B](https://huggingface.co/saishf/Fimbulvetr-Kuro-Lotus-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saishf__Fimbulvetr-Kuro-Lotus-10.7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T07:02:24.840541](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Fimbulvetr-Kuro-Lotus-10.7B/blob/main/results_2024-02-13T07-02-24.840541.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6729740857071812,\n \"acc_stderr\": 0.03133709945707105,\n \"acc_norm\": 0.6739144680121636,\n \"acc_norm_stderr\": 0.03197661295643159,\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.017448017223960884,\n \"mc2\": 0.6095070151643451,\n \"mc2_stderr\": 0.015534848379967322\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205765,\n \"acc_norm\": 0.6953924914675768,\n \"acc_norm_stderr\": 0.013449522109932485\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6906990639314877,\n \"acc_stderr\": 0.004612608206670405,\n \"acc_norm\": 0.8787094204341764,\n \"acc_norm_stderr\": 0.0032579745937899446\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810535,\n \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810535\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.455026455026455,\n \"acc_stderr\": 0.0256469283610494,\n \"acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.0256469283610494\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603915,\n \"acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188703,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188703\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8676470588235294,\n \"acc_stderr\": 0.023784297520918856,\n \"acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.023784297520918856\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.869198312236287,\n \"acc_stderr\": 0.02194876605947076,\n \"acc_norm\": 0.869198312236287,\n \"acc_norm_stderr\": 0.02194876605947076\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n \"acc_stderr\": 0.029918586707798824,\n \"acc_norm\": 0.726457399103139,\n \"acc_norm_stderr\": 0.029918586707798824\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.0215864940012814,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.0215864940012814\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973147,\n 
\"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973147\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48156424581005586,\n \"acc_stderr\": 0.01671113049778282,\n \"acc_norm\": 0.48156424581005586,\n \"acc_norm_stderr\": 0.01671113049778282\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02313237623454334,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02313237623454334\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.524822695035461,\n \"acc_stderr\": 0.029790719243829714,\n \"acc_norm\": 0.524822695035461,\n \"acc_norm_stderr\": 0.029790719243829714\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4973924380704042,\n \"acc_stderr\": 0.012770062445433172,\n \"acc_norm\": 0.4973924380704042,\n \"acc_norm_stderr\": 0.012770062445433172\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7075163398692811,\n \"acc_stderr\": 0.018403415710109797,\n \"acc_norm\": 0.7075163398692811,\n \"acc_norm_stderr\": 0.018403415710109797\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827054,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.017448017223960884,\n \"mc2\": 0.6095070151643451,\n \"mc2_stderr\": 0.015534848379967322\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028224\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.66868840030326,\n \"acc_stderr\": 0.012964999679688664\n }\n}\n```", "repo_url": 
"https://huggingface.co/saishf/Fimbulvetr-Kuro-Lotus-10.7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|arc:challenge|25_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|gsm8k|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hellaswag|10_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-02-24.840541.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-02-24.840541.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-02-24.840541.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T07-02-24.840541.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-02-24.840541.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-02-24.840541.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["**/details_harness|winogrande|5_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T07-02-24.840541.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T07_02_24.840541", "path": ["results_2024-02-13T07-02-24.840541.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T07-02-24.840541.parquet"]}]}]} | 2024-02-13T07:05:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of saishf/Fimbulvetr-Kuro-Lotus-10.7B
Dataset automatically created during the evaluation run of model saishf/Fimbulvetr-Kuro-Lotus-10.7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
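For example (a minimal sketch; the repository id below is assumed from the leaderboard's usual `details_<org>__<model>` naming convention rather than stated in this excerpt):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's standard naming pattern.
data = load_dataset("open-llm-leaderboard/details_saishf__Fimbulvetr-Kuro-Lotus-10.7B",
	"harness_winogrande_5",
	split="train")
```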
## Latest results
These are the latest results from run 2024-02-13T07:02:24.840541 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
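The aggregated numbers themselves can be pulled from the dedicated `results` configuration, whose `latest` split always tracks the most recent run (a sketch under the same repository-id assumption as above):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; "latest"
# always points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_saishf__Fimbulvetr-Kuro-Lotus-10.7B",
	"results",
	split="latest")
```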
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of saishf/Fimbulvetr-Kuro-Lotus-10.7B\n\n\n\nDataset automatically created during the evaluation run of model saishf/Fimbulvetr-Kuro-Lotus-10.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T07:02:24.840541(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of saishf/Fimbulvetr-Kuro-Lotus-10.7B\n\n\n\nDataset automatically created during the evaluation run of model saishf/Fimbulvetr-Kuro-Lotus-10.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T07:02:24.840541(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of saishf/Fimbulvetr-Kuro-Lotus-10.7B\n\n\n\nDataset automatically created during the evaluation run of model saishf/Fimbulvetr-Kuro-Lotus-10.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T07:02:24.840541(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
3cbc01339659c7c53d2bf21cfaa1db3d34110841 |
Dataset for developing a ControlNet conditioned on facial landmarks.
The original dataset is CelebAHQ resized to 256×256.
Facial landmark information is used to draw the conditioning image: the nose in red, the mouth in white, the left eye, iris, and eyebrow in green, and the right eye, iris, and eyebrow in blue.
Face descriptions are generated with BLIP.
Facial landmark information is extracted with Mediapipe.
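A minimal loading sketch (the column names and splits below are taken from this dataset's metadata):

```python
from datasets import load_dataset

# Columns crop_image / landmark_image / prompt_text and the
# train / validation splits come from the dataset's metadata.
ds = load_dataset("saeu5407/celebahq_landmark4controlnet", split="train")
example = ds[0]
face = example["crop_image"]           # 256x256 face crop (PIL image)
condition = example["landmark_image"]  # colour-coded landmark image
caption = example["prompt_text"]       # BLIP-generated description
```

And a rough illustration of how such a colour-coded conditioning image could be produced with Mediapipe FaceMesh. This is not the exact generation script, and it assumes a Mediapipe release that exposes the `FACEMESH_NOSE` connection set:

```python
import cv2
import numpy as np
import mediapipe as mp

fm = mp.solutions.face_mesh

GROUPS = [  # (connection set, BGR colour)
    (fm.FACEMESH_NOSE, (0, 0, 255)),      # nose -> red (assumed available)
    (fm.FACEMESH_LIPS, (255, 255, 255)),  # mouth -> white
    (fm.FACEMESH_LEFT_EYE | fm.FACEMESH_LEFT_IRIS | fm.FACEMESH_LEFT_EYEBROW,
     (0, 255, 0)),                        # left eye/iris/eyebrow -> green
    (fm.FACEMESH_RIGHT_EYE | fm.FACEMESH_RIGHT_IRIS | fm.FACEMESH_RIGHT_EYEBROW,
     (255, 0, 0)),                        # right eye/iris/eyebrow -> blue
]

def landmark_image(bgr: np.ndarray) -> np.ndarray:
    """Draw colour-coded facial-landmark edges on a black canvas."""
    h, w = bgr.shape[:2]
    canvas = np.zeros_like(bgr)
    # refine_landmarks=True adds the iris landmarks (indices 468-477).
    with fm.FaceMesh(static_image_mode=True, refine_landmarks=True) as mesh:
        res = mesh.process(cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB))
    if not res.multi_face_landmarks:
        return canvas
    pts = [(int(p.x * w), int(p.y * h))
           for p in res.multi_face_landmarks[0].landmark]
    for connections, colour in GROUPS:
        for a, b in connections:
            cv2.line(canvas, pts[a], pts[b], colour, 1)
    return canvas
```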
| saeu5407/celebahq_landmark4controlnet | [
"region:us"
] | 2024-02-13T07:08:20+00:00 | {"dataset_info": {"features": [{"name": "crop_image", "dtype": "image"}, {"name": "landmark_image", "dtype": "image"}, {"name": "prompt_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4621966015.75, "num_examples": 26965}, {"name": "validation", "num_bytes": 515133042.0, "num_examples": 3000}], "download_size": 5137062206, "dataset_size": 5137099057.75}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-13T08:50:57+00:00 | [] | [] | TAGS
#region-us
|
Dataset for developing a ControlNet conditioned on facial landmarks.
The original dataset is CelebAHQ resized to 256×256.
Facial landmark information is used to draw the conditioning image: the nose in red, the mouth in white, the left eye, iris, and eyebrow in green, and the right eye, iris, and eyebrow in blue.
Face descriptions are generated with BLIP.
Facial landmark information is extracted with Mediapipe.
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
4d09431a9e358720589100330b5ba72ffa32b8aa | # Dataset Card for "high_vs_randommin_100_issues_per_repo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | kristmh/high_vs_randommin_100_issues_per_repo | [
"region:us"
] | 2024-02-13T07:20:34+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validate", "path": "data/validate-*"}]}], "dataset_info": {"features": [{"name": "Unnamed: 0", "dtype": "int64"}, {"name": "text_clean", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 13172272, "num_examples": 11681}, {"name": "train", "num_bytes": 103353722, "num_examples": 93441}, {"name": "validate", "num_bytes": 13025230, "num_examples": 11680}], "download_size": 61083853, "dataset_size": 129551224}} | 2024-02-13T07:21:00+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "high_vs_randommin_100_issues_per_repo"
More Information needed | [
"# Dataset Card for \"high_vs_randommin_100_issues_per_repo\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"high_vs_randommin_100_issues_per_repo\"\n\nMore Information needed"
] | [
6,
26
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"high_vs_randommin_100_issues_per_repo\"\n\nMore Information needed"
] |
b9e6292250cc732d10d9ccc38314f678d4e977e0 |
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T07:27:37.172195](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta/blob/main/results_2024-02-13T07-27-37.172195.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.595356587300876,
"acc_stderr": 0.03311822764879789,
"acc_norm": 0.6057673454107737,
"acc_norm_stderr": 0.0339087917676742,
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150535,
"mc2": 0.4975919881917549,
"mc2_stderr": 0.01579574647682552
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.014521226405627079,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009128
},
"harness|hellaswag|10": {
"acc": 0.6152160924118701,
"acc_stderr": 0.004855498343308391,
"acc_norm": 0.8133837880900219,
"acc_norm_stderr": 0.0038880689432920727
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.02930010170554965,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.02930010170554965
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033583,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033583
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.02497695405315525,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.02497695405315525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.031544498882702866,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.031544498882702866
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040696,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040696
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371155,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371155
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647907,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647907
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.01596103667523096,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.01596103667523096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409818,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409818
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41199478487614083,
"acc_stderr": 0.012570871032146073,
"acc_norm": 0.41199478487614083,
"acc_norm_stderr": 0.012570871032146073
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.019821843688271768,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.019821843688271768
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017197,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017197
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150535,
"mc2": 0.4975919881917549,
"mc2_stderr": 0.01579574647682552
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
},
"harness|gsm8k|5": {
"acc": 0.05686125852918878,
"acc_stderr": 0.006378790242099631
}
}
```
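To work with these aggregated numbers directly rather than with the per-sample details, the "results" configuration can be loaded on its own; following the convention described above, its "latest" split always points at the most recent run:

```python
from datasets import load_dataset

# Loads only the aggregated metrics shown above, not per-sample details.
results = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta",
	"results",
	split="latest")
```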
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta | [
"region:us"
] | 2024-02-13T07:30:03+00:00 | {"pretty_name": "Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta", "dataset_summary": "Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T07:27:37.172195](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta/blob/main/results_2024-02-13T07-27-37.172195.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.595356587300876,\n \"acc_stderr\": 0.03311822764879789,\n \"acc_norm\": 0.6057673454107737,\n \"acc_norm_stderr\": 0.0339087917676742,\n \"mc1\": 0.33414932680538556,\n \"mc1_stderr\": 0.016512530677150535,\n \"mc2\": 0.4975919881917549,\n \"mc2_stderr\": 0.01579574647682552\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.014521226405627079,\n \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009128\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6152160924118701,\n \"acc_stderr\": 0.004855498343308391,\n \"acc_norm\": 0.8133837880900219,\n \"acc_norm_stderr\": 0.0038880689432920727\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.5407407407407407,\n \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.02930010170554965,\n \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.02930010170554965\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 
0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033583,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033583\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315525,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315525\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.031544498882702866,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.031544498882702866\n },\n 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.02126271940040696,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.02126271940040696\n },\n 
\"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647907,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647907\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n \"acc_stderr\": 0.01596103667523096,\n \"acc_norm\": 0.35083798882681566,\n \"acc_norm_stderr\": 0.01596103667523096\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409818,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409818\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41199478487614083,\n \"acc_stderr\": 0.012570871032146073,\n \"acc_norm\": 0.41199478487614083,\n \"acc_norm_stderr\": 0.012570871032146073\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5996732026143791,\n \"acc_stderr\": 0.019821843688271768,\n \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.019821843688271768\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.029475250236017197,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.029475250236017197\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33414932680538556,\n \"mc1_stderr\": 0.016512530677150535,\n \"mc2\": 0.4975919881917549,\n 
\"mc2_stderr\": 0.01579574647682552\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05686125852918878,\n \"acc_stderr\": 0.006378790242099631\n }\n}\n```", "repo_url": "https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|arc:challenge|25_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|gsm8k|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hellaswag|10_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-27-37.172195.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-27-37.172195.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-27-37.172195.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T07-27-37.172195.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-27-37.172195.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["**/details_harness|winogrande|5_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T07-27-37.172195.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T07_27_37.172195", "path": ["results_2024-02-13T07-27-37.172195.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T07-27-37.172195.parquet"]}]}]} | 2024-02-13T07:30:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta
Dataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
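A minimal sketch, using one of the per-task configurations listed in this dataset's metadata (here the 5-shot Winogrande config):

```python
from datasets import load_dataset

# Per-task details for one evaluated task (5-shot Winogrande);
# the "train" split always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta",
    "harness_winogrande_5",
    split="train",
)
```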
## Latest results
These are the latest results from run 2024-02-13T07:27:37.172195 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval).
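A minimal sketch for retrieving these aggregated numbers programmatically, assuming the "results" configuration and "latest" split described above:

```python
from datasets import load_dataset

# "results" aggregates all task metrics for this model;
# the "latest" split points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta",
    "results",
    split="latest",
)
```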
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T07:27:37.172195(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T07:27:37.172195(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
233,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T07:27:37.172195(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations"
] |
e094080a79519ae7a864eba061ded2920c5d333e |
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta",
"harness_winogrande_5",
split="train")
```
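The aggregated "results" configuration mentioned above can be loaded the same way; here is a minimal sketch (config and split names taken from this card's metadata):

```python
from datasets import load_dataset

# "results" is the extra configuration described above; the "latest"
# split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated metrics
```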
## Latest results
These are the [latest results from run 2024-02-13T07:34:28.878740](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta/blob/main/results_2024-02-13T07-34-28.878740.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5960695400984635,
"acc_stderr": 0.03323664542144247,
"acc_norm": 0.6045531326002294,
"acc_norm_stderr": 0.03395162360261904,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5235673204188883,
"mc2_stderr": 0.016399852429558985
},
"harness|arc:challenge|25": {
"acc": 0.5460750853242321,
"acc_stderr": 0.014549221105171867,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642662
},
"harness|hellaswag|10": {
"acc": 0.6106353316072496,
"acc_stderr": 0.0048660968809414425,
"acc_norm": 0.7982473610834495,
"acc_norm_stderr": 0.004004883380078933
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.02496268356433179,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.02496268356433179
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787586,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787586
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333564,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333564
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069706,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.02656892101545715,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.02656892101545715
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811942,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811942
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.019643801557924803,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.019643801557924803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.0279626776047689,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.0279626776047689
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5235673204188883,
"mc2_stderr": 0.016399852429558985
},
"harness|winogrande|5": {
"acc": 0.7324388318863457,
"acc_stderr": 0.012441718456893012
},
"harness|gsm8k|5": {
"acc": 0.19029567854435178,
"acc_stderr": 0.01081234728318298
}
}
```
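To work with these numbers programmatically rather than copy them from the block above, the raw results file linked in this section can be downloaded directly. A minimal sketch follows; the nesting of the JSON is assumed to match the block printed above, with a defensive fallback in case the metrics sit under a top-level "results" key:

```python
import json

from huggingface_hub import hf_hub_download

# Filename taken from the link above; repo_type must be "dataset".
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta",
    filename="results_2024-02-13T07-34-28.878740.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The metrics may be the whole file or sit under a "results" key,
# depending on the harness version; handle both.
metrics = data.get("results", data)

# MMLU is conventionally the mean accuracy over the 57 hendrycksTest subtasks.
mmlu_accs = [
    v["acc"] for k, v in metrics.items() if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU (5-shot): {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```

The same pattern works for any other run timestamp listed in the repository.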
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
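While a field-level schema is not documented here, the configuration and split layout described earlier in this card can be inspected directly; a minimal sketch (repository id taken from this card, counts as stated above):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta"

configs = get_dataset_config_names(repo)
print(len(configs))  # the 63 task configurations plus the aggregated "results"
print(configs[:3])   # e.g. ['harness_arc_challenge_25', 'harness_gsm8k_5', ...]

# Every configuration exposes one split per run timestamp plus "latest".
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```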
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
#region-us
|
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta
Dataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
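The code snippet that normally follows this sentence was stripped in this copy of the card; a minimal sketch, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming convention used by the other cards in this document:

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard's naming convention
# (details_<org>__<model>); adjust it if the actual repo differs.
data = load_dataset(
    "open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta",
    "harness_winogrande_5",  # any of the 63 task configurations works here
    split="train",
)
```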
## Latest results
These are the latest results from run 2024-02-13T07:34:28.878740 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T07:34:28.878740(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T07:34:28.878740(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
233,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T07:34:28.878740(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations"
] |
6460a7a326948bd94ca5d4c271cfabd560291a30 |
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta",
"harness_winogrande_5",
split="train")
```
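If you need to pin a specific run rather than track the moving "train" and "latest" splits, the timestamped split named after the run can be requested directly; a minimal sketch using the run timestamp shown in this card:

```python
from datasets import load_dataset

# Each run is stored under a split named after its timestamp, so this pins
# the 2024-02-13T07:37:59 run even if the repository is later re-evaluated.
pinned = load_dataset(
    "open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta",
    "harness_winogrande_5",
    split="2024_02_13T07_37_59.900682",
)
```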
## Latest results
These are the [latest results from run 2024-02-13T07:37:59.900682](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta/blob/main/results_2024-02-13T07-37-59.900682.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5887708742073996,
"acc_stderr": 0.033320056261042716,
"acc_norm": 0.5998480729795278,
"acc_norm_stderr": 0.03416670673744603,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5757997033411437,
"mc2_stderr": 0.015647081417738417
},
"harness|arc:challenge|25": {
"acc": 0.5409556313993175,
"acc_stderr": 0.01456229107360123,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.6405098585939056,
"acc_stderr": 0.004788703173474751,
"acc_norm": 0.8253335988846843,
"acc_norm_stderr": 0.003789055487003183
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.042992689054808644,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.042992689054808644
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7290322580645161,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.7290322580645161,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785742,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785742
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454805,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454805
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630804,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842534,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842534
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008732,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008732
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729245,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729245
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7879948914431673,
"acc_stderr": 0.014616099385833676,
"acc_norm": 0.7879948914431673,
"acc_norm_stderr": 0.014616099385833676
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242826,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242826
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2905027932960894,
"acc_stderr": 0.015183844307206144,
"acc_norm": 0.2905027932960894,
"acc_norm_stderr": 0.015183844307206144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.02718449890994161,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.02718449890994161
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719978,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719978
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4282920469361147,
"acc_stderr": 0.012638223880313167,
"acc_norm": 0.4282920469361147,
"acc_norm_stderr": 0.012638223880313167
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.01982184368827176,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.01982184368827176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348377,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348377
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5757997033411437,
"mc2_stderr": 0.015647081417738417
},
"harness|winogrande|5": {
"acc": 0.7490134175217048,
"acc_stderr": 0.012185776220516156
},
"harness|gsm8k|5": {
"acc": 0.014404852160727824,
"acc_stderr": 0.0032820559171369023
}
}
```
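To read these aggregated numbers programmatically rather than from the JSON above, the "results" configuration mentioned earlier can be loaded directly; a minimal sketch (the exact record layout of the results split is an assumption here, so inspect the first row rather than relying on specific field names):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split always points
# at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta",
    "results",
    split="latest",
)
print(results[0])  # field names vary, so inspect the row before indexing into it
```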
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta | [
"region:us"
] | 2024-02-13T07:40:18+00:00 | {"pretty_name": "Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta", "dataset_summary": "Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T07:37:59.900682](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta/blob/main/results_2024-02-13T07-37-59.900682.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5887708742073996,\n \"acc_stderr\": 0.033320056261042716,\n \"acc_norm\": 0.5998480729795278,\n \"acc_norm_stderr\": 0.03416670673744603,\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5757997033411437,\n \"mc2_stderr\": 0.015647081417738417\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5409556313993175,\n \"acc_stderr\": 0.01456229107360123,\n \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6405098585939056,\n \"acc_stderr\": 0.004788703173474751,\n \"acc_norm\": 0.8253335988846843,\n \"acc_norm_stderr\": 0.003789055487003183\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 
0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785742,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785742\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270286,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270286\n },\n 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454805,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454805\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630804,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630804\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842534,\n \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842534\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.03244305283008732,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.03244305283008732\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n 
\"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7879948914431673,\n \"acc_stderr\": 0.014616099385833676,\n \"acc_norm\": 0.7879948914431673,\n \"acc_norm_stderr\": 0.014616099385833676\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242826,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242826\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2905027932960894,\n \"acc_stderr\": 0.015183844307206144,\n \"acc_norm\": 0.2905027932960894,\n \"acc_norm_stderr\": 0.015183844307206144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.02718449890994161,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.02718449890994161\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719978,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719978\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n \"acc_stderr\": 0.012638223880313167,\n \"acc_norm\": 0.4282920469361147,\n \"acc_norm_stderr\": 0.012638223880313167\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5996732026143791,\n \"acc_stderr\": 0.01982184368827176,\n \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.01982184368827176\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n \"acc_stderr\": 0.029929415408348377,\n \"acc_norm\": 0.7661691542288557,\n \"acc_norm_stderr\": 0.029929415408348377\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5757997033411437,\n \"mc2_stderr\": 0.015647081417738417\n 
},\n \"harness|winogrande|5\": {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516156\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \"acc_stderr\": 0.0032820559171369023\n }\n}\n```", "repo_url": "https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|arc:challenge|25_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|gsm8k|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hellaswag|10_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-37-59.900682.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-37-59.900682.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-37-59.900682.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T07-37-59.900682.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-37-59.900682.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["**/details_harness|winogrande|5_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T07-37-59.900682.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T07_37_59.900682", "path": ["results_2024-02-13T07-37-59.900682.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T07-37-59.900682.parquet"]}]}]} | 2024-02-13T07:40:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta
Dataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-13T07:37:59.900682 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T07:37:59.900682(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T07:37:59.900682(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
233,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDChosen-SFTRejected-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T07:37:59.900682(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations"
] |
46b65c7d92991384b950e9f6475de125623bf543 |
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta",
"harness_winogrande_5",
split="train")
```
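Along the same lines, here is a minimal sketch (assuming the `results` configuration and the `latest` split listed in this card's metadata) for loading only the aggregated scores of the most recent run:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; per this card's metadata,
# the "latest" split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta",
    "results",
    split="latest",
)
```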
## Latest results
These are the [latest results from run 2024-02-13T07:45:36.772955](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta/blob/main/results_2024-02-13T07-45-36.772955.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5881347554822653,
"acc_stderr": 0.03323337682315634,
"acc_norm": 0.5985524193468641,
"acc_norm_stderr": 0.03407659901073233,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418187,
"mc2": 0.5809745989468564,
"mc2_stderr": 0.01537123845007581
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.01443413871337998,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.014269634635670714
},
"harness|hellaswag|10": {
"acc": 0.6483768173670583,
"acc_stderr": 0.004765012078929389,
"acc_norm": 0.8367855008962358,
"acc_norm_stderr": 0.003688059831239015
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.02930010170554965,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.02930010170554965
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936338,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936338
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.02479606060269995,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.02479606060269995
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785742,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785742
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.024985354923102342,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.024985354923102342
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.8,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094634,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094634
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.01513338327898883,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.01513338327898883
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.02552247463212161,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.02552247463212161
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.02705797462449438,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.02705797462449438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037086,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037086
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.01259674410899856,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.01259674410899856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085634,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418187,
"mc2": 0.5809745989468564,
"mc2_stderr": 0.01537123845007581
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.011947592365207394
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.0035275958887224534
}
}
```
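As an illustration, the blob above can be post-processed directly; the following sketch (assuming the JSON has been saved locally under the hypothetical name `results.json`) recovers the per-subtask MMLU scores and their mean:

```python
import json

# Minimal sketch: parse the results shown above and average the acc_norm
# scores of the "harness|hendrycksTest-*" (MMLU) subtasks.
with open("results.json") as f:  # hypothetical local copy of the JSON above
    results = json.load(f)

mmlu = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
mean_acc_norm = sum(mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {mean_acc_norm:.4f}")
```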
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta | [
"region:us"
] | 2024-02-13T07:47:57+00:00 | {"pretty_name": "Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta", "dataset_summary": "Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T07:45:36.772955](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta/blob/main/results_2024-02-13T07-45-36.772955.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5881347554822653,\n \"acc_stderr\": 0.03323337682315634,\n \"acc_norm\": 0.5985524193468641,\n \"acc_norm_stderr\": 0.03407659901073233,\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.017115815632418187,\n \"mc2\": 0.5809745989468564,\n \"mc2_stderr\": 0.01537123845007581\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.01443413871337998,\n \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670714\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6483768173670583,\n \"acc_stderr\": 0.004765012078929389,\n \"acc_norm\": 0.8367855008962358,\n \"acc_norm_stderr\": 0.003688059831239015\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.5407407407407407,\n \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336284,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336284\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.02930010170554965,\n \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.02930010170554965\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n 
\"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936338,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936338\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.02479606060269995,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.02479606060269995\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785742,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785742\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n 
\"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.024985354923102342,\n \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.024985354923102342\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094634,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094634\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n 
\"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n \"acc_stderr\": 0.01513338327898883,\n \"acc_norm\": 0.7662835249042146,\n \"acc_norm_stderr\": 0.01513338327898883\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.02552247463212161,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.02552247463212161\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.02705797462449438,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.02705797462449438\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037086,\n \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037086\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n \"acc_stderr\": 0.01259674410899856,\n \"acc_norm\": 0.4178617992177314,\n \"acc_norm_stderr\": 0.01259674410899856\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085634,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085634\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.017115815632418187,\n \"mc2\": 0.5809745989468564,\n \"mc2_stderr\": 0.01537123845007581\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 
0.011947592365207394\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.016679302501895376,\n \"acc_stderr\": 0.0035275958887224534\n }\n}\n```", "repo_url": "https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|arc:challenge|25_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|gsm8k|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hellaswag|10_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-45-36.772955.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-45-36.772955.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-45-36.772955.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T07-45-36.772955.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-45-36.772955.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["**/details_harness|winogrande|5_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-13T07-45-36.772955.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T07_45_36.772955", "path": ["results_2024-02-13T07-45-36.772955.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T07-45-36.772955.parquet"]}]}]} | 2024-02-13T07:48:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta
Dataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
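For example (the repository name below follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other evaluation datasets in this document, and `harness_winogrande_5` is one of the configurations listed in the metadata above):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta",
	"harness_winogrande_5",
	split="train")
```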
## Latest results
These are the latest results from run 2024-02-13T07:45:36.772955 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T07:45:36.772955(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T07:45:36.772955(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
233,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDChosen-SFTRejected-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T07:45:36.772955(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations"
] |
b51eb5a30a49efed3337db2174d98f2f74cb2e30 | # Dataset Card for "find_first_sent_train_400_eval_40_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_first_sent_train_400_eval_40_baseline | [
"region:us"
] | 2024-02-13T07:54:45+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 680916, "num_examples": 400}, {"name": "validation", "num_bytes": 70650, "num_examples": 40}], "download_size": 498501, "dataset_size": 751566}} | 2024-02-13T07:54:52+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "find_first_sent_train_400_eval_40_baseline"
More Information needed | [
"# Dataset Card for \"find_first_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_first_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] | [
6,
29
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_first_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] |
8cf33c6014072888bb0bf735d2bfb369e28d9cab | # Dataset Card for "find_last_sent_train_400_eval_40_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_last_sent_train_400_eval_40_baseline | [
"region:us"
] | 2024-02-13T07:55:01+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 679273, "num_examples": 400}, {"name": "validation", "num_bytes": 70683, "num_examples": 40}], "download_size": 497707, "dataset_size": 749956}} | 2024-02-13T07:55:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "find_last_sent_train_400_eval_40_baseline"
More Information needed | [
"# Dataset Card for \"find_last_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_last_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] | [
6,
28
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_last_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] |
6114b057447009bdb7ebe04bed38207665d56b12 | # Dataset Card for "find_second_sent_train_400_eval_40_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_second_sent_train_400_eval_40_baseline | [
"region:us"
] | 2024-02-13T07:55:18+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 681388, "num_examples": 400}, {"name": "validation", "num_bytes": 70388, "num_examples": 40}], "download_size": 498298, "dataset_size": 751776}} | 2024-02-13T07:55:26+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "find_second_sent_train_400_eval_40_baseline"
More Information needed | [
"# Dataset Card for \"find_second_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_second_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] | [
6,
28
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_second_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] |
6616599e5cfe0f59c3b48e2cc62d8048eb944fec | # Dataset Card for "find_sent_before_sent_train_400_eval_40_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_sent_before_sent_train_400_eval_40_baseline | [
"region:us"
] | 2024-02-13T08:07:14+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3760579, "num_examples": 1994}, {"name": "validation", "num_bytes": 386422, "num_examples": 200}], "download_size": 791972, "dataset_size": 4147001}} | 2024-02-13T08:07:22+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "find_sent_before_sent_train_400_eval_40_baseline"
More Information needed | [
"# Dataset Card for \"find_sent_before_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_sent_before_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] | [
6,
31
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_before_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] |
eed1919bcec20e5b72361eea0a2afc430ae162a5 | # Dataset Card for "wsd_myriade_synth_data_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/wsd_myriade_synth_data_v3 | [
"region:us"
] | 2024-02-13T08:07:18+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 57036817, "num_examples": 101321}], "download_size": 0, "dataset_size": 57036817}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T13:38:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "wsd_myriade_synth_data_v3"
More Information needed | [
"# Dataset Card for \"wsd_myriade_synth_data_v3\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"wsd_myriade_synth_data_v3\"\n\nMore Information needed"
] | [
6,
24
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"wsd_myriade_synth_data_v3\"\n\nMore Information needed"
] |
f079a93af8bc1fce84c4d7efe347be70daf2688c | # Dataset Card for "find_sent_after_sent_train_400_eval_40_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/find_sent_after_sent_train_400_eval_40_baseline | [
"region:us"
] | 2024-02-13T08:07:29+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3757342, "num_examples": 1994}, {"name": "validation", "num_bytes": 386295, "num_examples": 200}], "download_size": 792704, "dataset_size": 4143637}} | 2024-02-13T08:07:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "find_sent_after_sent_train_400_eval_40_baseline"
More Information needed | [
"# Dataset Card for \"find_sent_after_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"find_sent_after_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] | [
6,
31
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_after_sent_train_400_eval_40_baseline\"\n\nMore Information needed"
] |
4c84d503e2e186812cfb9f792e0aaad274693312 |

The dataset is a merge of other open datasets:
- [wmt19](https://huggingface.co/datasets/wmt19) (lt-en)
- [opus100](https://huggingface.co/datasets/opus100) (en-lt)
- [sentence-transformers/parallel-sentences](https://huggingface.co/datasets/sentence-transformers/parallel-sentences)
- Europarl-en-lt-train.tsv.gz
- JW300-en-lt-train.tsv.gz
- OpenSubtitles-en-lt-train.tsv.gz
- Talks-en-lt-train.tsv.gz
- Tatoeba-en-lt-train.tsv.gz
- WikiMatrix-en-lt-train.tsv.gz
- Custom [Scoris](https://scoris.lt) dataset translated using DeepL.
Basic clean-up and deduplication were applied when creating this set.
This can be used to train Lithuanian-English-Lithuanian MT Seq2Seq models.
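As a rough sketch of feeding the pairs to a seq2seq model (the checkpoint and preprocessing choices below are illustrative assumptions, not part of this card):

```python
from transformers import AutoTokenizer

# Illustrative assumption: any seq2seq checkpoint with Lithuanian coverage could be used here.
tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")

def preprocess(example):
    # Each example holds a {'en': ..., 'lt': ...} translation pair.
    pair = example["translation"]
    return tokenizer(pair["en"], text_target=pair["lt"],
                     truncation=True, max_length=128)
```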
Made by [Scoris](https://scoris.lt) team
You can use this in the following way:
```python
from datasets import load_dataset
dataset_name = "scoris/en-lt-merged-data"
# Load the dataset
dataset = load_dataset(dataset_name)
# Accessing data
# Display the first example from the training set
print("First training example:", dataset['train'][0])
# Display the first example from the validation set
print("First validation example:", dataset['validation'][0])
# Iterate through a few examples from the training set
for i, example in enumerate(dataset['train']):
if i < 5:
print(f"Training example {i}:", example)
else:
break
# If you want to use the dataset in a machine learning model, you can directly
# iterate over the dataset or convert it to a pandas DataFrame for analysis
import pandas as pd
# Convert the training set to a pandas DataFrame
train_df = pd.DataFrame(dataset['train'])
print(train_df.head())
``` | scoris/en-lt-merged-data | [
"size_categories:1M<n<10M",
"language:lt",
"language:en",
"license:cc-by-2.5",
"region:us"
] | 2024-02-13T08:25:25+00:00 | {"language": ["lt", "en"], "license": "cc-by-2.5", "size_categories": ["1M<n<10M"], "dataset_info": {"features": [{"name": "translation", "struct": [{"name": "en", "dtype": "string"}, {"name": "lt", "dtype": "string"}]}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 945130215, "num_examples": 5422278}, {"name": "validation", "num_bytes": 9521400, "num_examples": 54771}], "download_size": 719193731, "dataset_size": 954651615}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-15T08:14:47+00:00 | [] | [
"lt",
"en"
] | TAGS
#size_categories-1M<n<10M #language-Lithuanian #language-English #license-cc-by-2.5 #region-us
|
!Scoris logo
The data set is a merge of other open datasets:
- wmt19 (lt-en)
- opus100 (en-lt)
- sentence-transformers/parallel-sentences
- URL
- URL
- URL
- URL
- URL
- URL
- Custom Scoris data set translated using Deepl.
Basic clean-up and deduplication was applied when creating this set
This can be used to train Lithuanian-English-Lithuanian MT Seq2Seq models.
Made by Scoris team
You can use this in the following way:
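A condensed version of the usage snippet from the card above:

```python
from datasets import load_dataset

dataset = load_dataset("scoris/en-lt-merged-data")
print(dataset["train"][0])  # {'translation': {'en': ..., 'lt': ...}, ...}
```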
| [] | [
"TAGS\n#size_categories-1M<n<10M #language-Lithuanian #language-English #license-cc-by-2.5 #region-us \n"
] | [
38
] | [
"passage: TAGS\n#size_categories-1M<n<10M #language-Lithuanian #language-English #license-cc-by-2.5 #region-us \n"
] |
a0b99a3128d07b239e4530a825ab3442cc3124fe |
Parquet generation code at: https://github.com/maagic6/fyp_project/blob/main/data/convert_parquet.py | maagic6/fyp_dataset | [
"task_categories:image-to-image",
"language:en",
"region:us"
] | 2024-02-13T08:49:03+00:00 | {"language": ["en"], "task_categories": ["image-to-image"], "dataset_info": {"features": [{"name": "before", "dtype": "image"}, {"name": "instruction", "dtype": "string"}, {"name": "after", "dtype": "image"}]}} | 2024-02-13T09:44:15+00:00 | [] | [
"en"
] | TAGS
#task_categories-image-to-image #language-English #region-us
|
parquet generation code at: URL | [] | [
"TAGS\n#task_categories-image-to-image #language-English #region-us \n"
] | [
22
] | [
"passage: TAGS\n#task_categories-image-to-image #language-English #region-us \n"
] |
35ab1efbb95d02b785889070fd08804b9828f1f0 |
# GooAQ (Google Answers to Google Questions) question-answer pairs in Danish
## About
This dataset is a version of the [GooAQ question-answer pairs dataset](https://huggingface.co/datasets/sentence-transformers/embedding-training-data) machine-translated from English to Danish ([link to original dataset](https://github.com/allenai/gooaq)).
Machine translation is performed using the Helsinki NLP [English-to-Danish OPUS-MT model](https://huggingface.co/Helsinki-NLP/opus-mt-en-da).
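For reference, a minimal sketch of the kind of translation call involved (the card names the Helsinki-NLP/opus-mt-en-da checkpoint; the authors' actual batching and preprocessing are not documented here and are an assumption):

```python
from transformers import pipeline

# Sketch only: Helsinki-NLP/opus-mt-en-da is the MT model linked above;
# the exact pipeline used to build this dataset is an assumption.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-da")
print(translator("What is the capital of Denmark?")[0]["translation_text"])
```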
The dataset contains ~3M question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage').
## Usage
Using the HuggingFace datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("KennethTM/gooaq_pairs_danish")
``` | KennethTM/gooaq_pairs_danish | [
"task_categories:feature-extraction",
"task_categories:question-answering",
"size_categories:1M<n<10M",
"language:da",
"license:apache-2.0",
"region:us"
] | 2024-02-13T09:02:22+00:00 | {"language": ["da"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["feature-extraction", "question-answering"], "dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "passage", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 934643435, "num_examples": 3012496}], "download_size": 627593528, "dataset_size": 934643435}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-13T09:08:46+00:00 | [] | [
"da"
] | TAGS
#task_categories-feature-extraction #task_categories-question-answering #size_categories-1M<n<10M #language-Danish #license-apache-2.0 #region-us
|
# GooAQ (Google Answers to Google Questions) question-answer pairs in Danish
## About
This dataset is a version of the GooAQ question-answer pairs dataset machine-translated from English to Danish (link to original dataset).
Machine translation is performed using the Helsinki NLP English-to-Danish OPUS-MT model.
The dataset contains ~3M question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage').
## Usage
Using the HuggingFace datasets library:
| [
"# GooAQ (Google Answers to Google Questions) question-answer pairs in Danish",
"## About\n\nThis dataset is a version of the GooAQ question-answer pairs dataset machine-translated from English to Danish (link to original dataset).\n\nMachine translation is performed using the Helsinki NLP English-to-Danish OPUS-MT model.\n\nThe dataset contains ~3M question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage').",
"## Usage\n\nUsing the HuggingFace datasets library:"
] | [
"TAGS\n#task_categories-feature-extraction #task_categories-question-answering #size_categories-1M<n<10M #language-Danish #license-apache-2.0 #region-us \n",
"# GooAQ (Google Answers to Google Questions) question-answer pairs in Danish",
"## About\n\nThis dataset is a version of the GooAQ question-answer pairs dataset machine-translated from English to Danish (link to original dataset).\n\nMachine translation is performed using the Helsinki NLP English-to-Danish OPUS-MT model.\n\nThe dataset contains ~3M question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage').",
"## Usage\n\nUsing the HuggingFace datasets library:"
] | [
55,
20,
114,
16
] | [
"passage: TAGS\n#task_categories-feature-extraction #task_categories-question-answering #size_categories-1M<n<10M #language-Danish #license-apache-2.0 #region-us \n# GooAQ (Google Answers to Google Questions) question-answer pairs in Danish## About\n\nThis dataset is a version of the GooAQ question-answer pairs dataset machine-translated from English to Danish (link to original dataset).\n\nMachine translation is performed using the Helsinki NLP English-to-Danish OPUS-MT model.\n\nThe dataset contains ~3M question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage').## Usage\n\nUsing the HuggingFace datasets library:"
] |
22bb58c603ce3ec02a48a546c5ce4464bec676da |
# Dataset Card for Evaluation run of AbacusResearch/haLLAwa3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AbacusResearch/haLLAwa3](https://huggingface.co/AbacusResearch/haLLAwa3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AbacusResearch__haLLAwa3",
"harness_winogrande_5",
split="train")
```
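The aggregated metrics live in the "results" configuration described above; a minimal sketch for pulling the newest run (the config and split names follow the description above, while the record layout is an assumption):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split points to the newest run.
results = load_dataset("open-llm-leaderboard/details_AbacusResearch__haLLAwa3",
                       "results",
                       split="latest")
print(results[0])
```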
## Latest results
These are the [latest results from run 2024-02-13T09:17:16.010723](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__haLLAwa3/blob/main/results_2024-02-13T09-17-16.010723.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6460967328473372,
"acc_stderr": 0.032133601190841035,
"acc_norm": 0.6467910792032593,
"acc_norm_stderr": 0.03278484291209426,
"mc1": 0.4602203182374541,
"mc1_stderr": 0.017448017223960884,
"mc2": 0.6371489024278089,
"mc2_stderr": 0.015333679187240897
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620197,
"acc_norm": 0.6783276450511946,
"acc_norm_stderr": 0.013650488084494162
},
"harness|hellaswag|10": {
"acc": 0.7039434375622386,
"acc_stderr": 0.004555832462774594,
"acc_norm": 0.8702449711212906,
"acc_norm_stderr": 0.003353469625027664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.037161774375660185,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.037161774375660185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055263,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055263
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47262569832402235,
"acc_stderr": 0.016697420650642752,
"acc_norm": 0.47262569832402235,
"acc_norm_stderr": 0.016697420650642752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675602,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675602
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4602203182374541,
"mc1_stderr": 0.017448017223960884,
"mc2": 0.6371489024278089,
"mc2_stderr": 0.015333679187240897
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938282
},
"harness|gsm8k|5": {
"acc": 0.6474601971190296,
"acc_stderr": 0.013159909755930333
}
}
```
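To make the per-task numbers above easier to scan, one could, for example, rank the harness tasks by accuracy; a minimal sketch, assuming the JSON above has been saved to disk and loaded into a dict shaped like the snippet shown:

```python
import json

def top_tasks(results: dict, n: int = 5):
    """Return the n highest-accuracy tasks from a results dict shaped like the one above."""
    scores = {
        task: metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:n]

# Path is an assumption; use the results file linked above.
# results = json.load(open("results_2024-02-13T09-17-16.010723.json"))
# for task, acc in top_tasks(results):
#     print(f"{acc:.3f}  {task}")
```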
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AbacusResearch__haLLAwa3 | [
"region:us"
] | 2024-02-13T09:19:35+00:00 | {"pretty_name": "Evaluation run of AbacusResearch/haLLAwa3", "dataset_summary": "Dataset automatically created during the evaluation run of model [AbacusResearch/haLLAwa3](https://huggingface.co/AbacusResearch/haLLAwa3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AbacusResearch__haLLAwa3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T09:17:16.010723](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__haLLAwa3/blob/main/results_2024-02-13T09-17-16.010723.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6460967328473372,\n \"acc_stderr\": 0.032133601190841035,\n \"acc_norm\": 0.6467910792032593,\n \"acc_norm_stderr\": 0.03278484291209426,\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.017448017223960884,\n \"mc2\": 0.6371489024278089,\n \"mc2_stderr\": 0.015333679187240897\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620197,\n \"acc_norm\": 0.6783276450511946,\n \"acc_norm_stderr\": 0.013650488084494162\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7039434375622386,\n \"acc_stderr\": 0.004555832462774594,\n \"acc_norm\": 0.8702449711212906,\n \"acc_norm_stderr\": 0.003353469625027664\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.037161774375660185,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.037161774375660185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n 
\"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055263,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055263\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n 
\"acc_stderr\": 0.02366129639396428,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 
0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47262569832402235,\n \"acc_stderr\": 0.016697420650642752,\n \"acc_norm\": 0.47262569832402235,\n \"acc_norm_stderr\": 0.016697420650642752\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675602,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675602\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.017448017223960884,\n \"mc2\": 0.6371489024278089,\n \"mc2_stderr\": 0.015333679187240897\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938282\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6474601971190296,\n \"acc_stderr\": 0.013159909755930333\n }\n}\n```", "repo_url": "https://huggingface.co/AbacusResearch/haLLAwa3", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|arc:challenge|25_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|gsm8k|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hellaswag|10_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T09-17-16.010723.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T09-17-16.010723.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T09-17-16.010723.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T09-17-16.010723.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T09-17-16.010723.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T09-17-16.010723.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["**/details_harness|winogrande|5_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T09-17-16.010723.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T09_17_16.010723", "path": ["results_2024-02-13T09-17-16.010723.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T09-17-16.010723.parquet"]}]}]} | 2024-02-13T09:19:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AbacusResearch/haLLAwa3
Dataset automatically created during the evaluation run of model AbacusResearch/haLLAwa3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
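(The code snippet was stripped from this text rendition; below is a minimal sketch, assuming the repo follows the `details_<org>__<model>` naming used by the other cards in this dump.)

```python
from datasets import load_dataset

# Repo id inferred from the details_<org>__<model> naming pattern; treat it
# as an assumption rather than a confirmed identifier.
data = load_dataset("open-llm-leaderboard/details_AbacusResearch__haLLAwa3",
	"harness_winogrande_5",
	split="train")
```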
## Latest results
These are the latest results from run 2024-02-13T09:17:16.010723 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
|
23ca50e94b4bd27724c731aa94f3d69d618bd33a |
# Dataset Card for Evaluation run of kevin009/babyllama-v0.6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kevin009/babyllama-v0.6](https://huggingface.co/kevin009/babyllama-v0.6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__babyllama-v0.6",
"harness_winogrande_5",
split="train")
```
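Each evaluated task is exposed as its own configuration alongside the aggregate "results" one; as a minimal sketch (the config and split names follow the pattern documented on this card, but treat the exact strings as assumptions), you can enumerate the configurations and pull the aggregated row like this:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_kevin009__babyllama-v0.6"

# Enumerate the per-task configurations plus the aggregate "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:3])

# The "latest" split of the "results" configuration always points at the
# aggregated metrics of the most recent run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```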
## Latest results
These are the [latest results from run 2024-02-13T10:06:30.565512](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__babyllama-v0.6/blob/main/results_2024-02-13T10-06-30.565512.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26066347798834766,
"acc_stderr": 0.030904794820091792,
"acc_norm": 0.26161932329960463,
"acc_norm_stderr": 0.0316608342460649,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487295,
"mc2": 0.3584100057903431,
"mc2_stderr": 0.013776314892170112
},
"harness|arc:challenge|25": {
"acc": 0.35238907849829354,
"acc_stderr": 0.013960142600598677,
"acc_norm": 0.3609215017064846,
"acc_norm_stderr": 0.014034761386175458
},
"harness|hellaswag|10": {
"acc": 0.46335391356303524,
"acc_stderr": 0.004976361454341339,
"acc_norm": 0.6159131647082254,
"acc_norm_stderr": 0.0048538457503921415
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.03302789859901717,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.03302789859901717
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677077,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677077
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788147,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788147
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.025091892378859275,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.025091892378859275
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03010833071801162,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03010833071801162
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423088,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423088
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507384,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507384
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863818,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863818
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473836,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473836
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.280970625798212,
"acc_stderr": 0.01607312785122125,
"acc_norm": 0.280970625798212,
"acc_norm_stderr": 0.01607312785122125
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967547,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22681564245810057,
"acc_stderr": 0.014005843570897897,
"acc_norm": 0.22681564245810057,
"acc_norm_stderr": 0.014005843570897897
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.02512263760881665,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.02512263760881665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23859191655801826,
"acc_stderr": 0.0108859297420022,
"acc_norm": 0.23859191655801826,
"acc_norm_stderr": 0.0108859297420022
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.017883188134667192,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.017883188134667192
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.03629335329947861,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.03629335329947861
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487295,
"mc2": 0.3584100057903431,
"mc2_stderr": 0.013776314892170112
},
"harness|winogrande|5": {
"acc": 0.6101026045777427,
"acc_stderr": 0.013707547317008463
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723890015
}
}
```
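Since every per-task entry above follows the uniform `harness|<task>|<n_shots>` key pattern, aggregates such as the MMLU average can be recomputed directly from this dict; here is a minimal sketch over an excerpt of the keys shown above:

```python
# Excerpt of the results dict printed above (values copied verbatim).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.24},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.17777777777777778},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17105263157894737},
}

# MMLU subtasks are exactly the keys prefixed with "harness|hendrycksTest-".
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU average acc over {len(mmlu)} subtasks: {mmlu_acc:.4f}")
```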
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kevin009__babyllama-v0.6 | [
"region:us"
] | 2024-02-13T10:08:16+00:00 | {"pretty_name": "Evaluation run of kevin009/babyllama-v0.6", "dataset_summary": "Dataset automatically created during the evaluation run of model [kevin009/babyllama-v0.6](https://huggingface.co/kevin009/babyllama-v0.6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevin009__babyllama-v0.6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T10:06:30.565512](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__babyllama-v0.6/blob/main/results_2024-02-13T10-06-30.565512.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26066347798834766,\n \"acc_stderr\": 0.030904794820091792,\n \"acc_norm\": 0.26161932329960463,\n \"acc_norm_stderr\": 0.0316608342460649,\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487295,\n \"mc2\": 0.3584100057903431,\n \"mc2_stderr\": 0.013776314892170112\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.35238907849829354,\n \"acc_stderr\": 0.013960142600598677,\n \"acc_norm\": 0.3609215017064846,\n \"acc_norm_stderr\": 0.014034761386175458\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46335391356303524,\n \"acc_stderr\": 0.004976361454341339,\n \"acc_norm\": 0.6159131647082254,\n \"acc_norm_stderr\": 0.0048538457503921415\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n \"acc_stderr\": 0.03302789859901717,\n \"acc_norm\": 0.17777777777777778,\n \"acc_norm_stderr\": 0.03302789859901717\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677077,\n \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677077\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 
0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n \"acc_stderr\": 0.030299574664788147,\n \"acc_norm\": 0.19653179190751446,\n \"acc_norm_stderr\": 0.030299574664788147\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n \"acc_stderr\": 0.025091892378859275,\n \"acc_norm\": 0.2645161290322581,\n \"acc_norm_stderr\": 0.025091892378859275\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03010833071801162,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03010833071801162\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423088,\n \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423088\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507384,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507384\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863818,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863818\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473836,\n \"acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473836\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.0283046579430353,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.0283046579430353\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.3632286995515695,\n \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.280970625798212,\n \"acc_stderr\": 0.01607312785122125,\n \"acc_norm\": 0.280970625798212,\n \"acc_norm_stderr\": 0.01607312785122125\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967547,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967547\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22681564245810057,\n \"acc_stderr\": 0.014005843570897897,\n \"acc_norm\": 0.22681564245810057,\n \"acc_norm_stderr\": 0.014005843570897897\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n \"acc_stderr\": 0.02512263760881665,\n \"acc_norm\": 0.26688102893890675,\n \"acc_norm_stderr\": 0.02512263760881665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n \"acc_stderr\": 0.0108859297420022,\n \"acc_norm\": 0.23859191655801826,\n \"acc_norm_stderr\": 0.0108859297420022\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667192,\n \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667192\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487295,\n \"mc2\": 0.3584100057903431,\n \"mc2_stderr\": 0.013776314892170112\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6101026045777427,\n \"acc_stderr\": 0.013707547317008463\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n 
\"acc_stderr\": 0.0034478192723890015\n }\n}\n```", "repo_url": "https://huggingface.co/kevin009/babyllama-v0.6", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|arc:challenge|25_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|gsm8k|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hellaswag|10_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-06-30.565512.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-06-30.565512.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-06-30.565512.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T10-06-30.565512.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-06-30.565512.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T10_06_30.565512", "path": ["**/details_harness|winogrande|5_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T10-06-30.565512.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_13T10_06_30.565512", "path": ["results_2024-02-13T10-06-30.565512.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T10-06-30.565512.parquet"]}]}]} | 2024-02-13T10:08:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kevin009/babyllama-v0.6
Dataset automatically created during the evaluation run of model kevin009/babyllama-v0.6 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
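A minimal sketch of that loading call, assuming the standard `datasets` API; the repository name below is inferred from the leaderboard's `details_<org>__<model>` naming convention and is not quoted from this card:

```python
from datasets import load_dataset

# Repository name inferred from the leaderboard naming convention; verify before use.
data = load_dataset("open-llm-leaderboard/details_kevin009__babyllama-v0.6",
                    "harness_winogrande_5",  # one of the 63 per-task configurations
                    split="train")
```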
## Latest results
These are the latest results from run 2024-02-13T10:06:30.565512 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kevin009/babyllama-v0.6\n\n\n\nDataset automatically created during the evaluation run of model kevin009/babyllama-v0.6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T10:06:30.565512(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kevin009/babyllama-v0.6\n\n\n\nDataset automatically created during the evaluation run of model kevin009/babyllama-v0.6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T10:06:30.565512(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kevin009/babyllama-v0.6\n\n\n\nDataset automatically created during the evaluation run of model kevin009/babyllama-v0.6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T10:06:30.565512(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
7d4ea267a86736b347c482f95fc8fd0394c4bceb |
We introduced this dataset in Points2Surf, a method that turns point clouds into meshes.
It consists of objects from the [_ABC Dataset_](https://paperswithcode.com/dataset/abc-dataset-1), a collection of _Famous_ meshes, and objects from [_Thingi10k_](https://paperswithcode.com/dataset/thingi10k).
Most files contain a single object; some contain a couple of disconnected objects. Objects from the _ABC Dataset_ are CAD models; the others are mostly statues with organic structures.
We created realistic point clouds using a simulated time-of-flight sensor from [_BlenSor_](https://www.blensor.org/). The point clouds have typical artifacts like noise and scan shadows.
Finally, we created training data consisting of randomly sampled query points with their ground-truth signed distances. Half of the query points are distributed uniformly in the unit cube; the other half lie near the surface with a random offset.
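For illustration, a minimal NumPy sketch of this 50/50 sampling scheme; the surface samples, point counts, and offset scale below are placeholder assumptions for the example, not values taken from the actual Points2Surf pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # total query points per object (illustrative)

# Half of the query points: uniform in the unit cube, here [-0.5, 0.5]^3.
uniform_pts = rng.uniform(-0.5, 0.5, size=(n // 2, 3))

# Other half: random surface samples plus a small random offset.
surface_samples = rng.uniform(-0.5, 0.5, size=(10000, 3))  # stand-in for real mesh surface samples
idx = rng.integers(0, len(surface_samples), size=n - n // 2)
near_surface_pts = surface_samples[idx] + rng.normal(scale=0.02, size=(n - n // 2, 3))

query_pts = np.concatenate([uniform_pts, near_surface_pts], axis=0)
# Ground-truth signed distances would then be computed for query_pts against the watertight mesh.
```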
The training set consists of 4950 _ABC_ objects with a varying number of scans and noise strengths.
The validation sets are the same as the test sets.
The _ABC_ test sets contain 100 objects, _Famous_ 22, and _Thingi10k_ 100. The test set variants are as follows:
(1) _ABC_ var (like training set), no noise, strong noise;
(2) _Famous_ no noise, medium noise, strong noise, sparse, dense scans;
(3) _Thingi10k_ no noise, medium noise, strong noise, sparse, dense scans | perler/ppsurf | [
"task_categories:summarization",
"size_categories:1K<n<10K",
"language:en",
"3d meshes",
"point clouds",
"synthetic",
"realistic",
"CAD",
"statues",
"region:us"
] | 2024-02-13T10:22:13+00:00 | {"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["summarization"], "pretty_name": "Points2Surf Dataset", "tags": ["3d meshes", "point clouds", "synthetic", "realistic", "CAD", "statues"]} | 2024-02-15T11:47:11+00:00 | [] | [
"en"
] | TAGS
#task_categories-summarization #size_categories-1K<n<10K #language-English #3d meshes #point clouds #synthetic #realistic #CAD #statues #region-us
|
We introduced this dataset in Points2Surf, a method that turns point clouds into meshes.
It consists of objects from the _ABC Dataset_, a collection of _Famous_ meshes, and objects from _Thingi10k_.
Most files contain a single object; some contain a couple of disconnected objects. Objects from the _ABC Dataset_ are CAD models; the others are mostly statues with organic structures.
We created realistic point clouds using a simulated time-of-flight sensor from _BlenSor_. The point clouds have typical artifacts like noise and scan shadows.
Finally, we created training data consisting of randomly sampled query points with their ground-truth signed distances. Half of the query points are distributed uniformly in the unit cube; the other half lie near the surface with a random offset.
The training set consists of 4950 _ABC_ objects with a varying number of scans and noise strengths.
The validation sets are the same as the test sets.
The _ABC_ test sets contain 100 objects, _Famous_ 22, and _Thingi10k_ 100. The test set variants are as follows:
(1) _ABC_ var (like training set), no noise, strong noise;
(2) _Famous_ no noise, medium noise, strong noise, sparse, dense scans;
(3) _Thingi10k_ no noise, medium noise, strong noise, sparse, dense scans | [] | [
"TAGS\n#task_categories-summarization #size_categories-1K<n<10K #language-English #3d meshes #point clouds #synthetic #realistic #CAD #statues #region-us \n"
] | [
52
] | [
"passage: TAGS\n#task_categories-summarization #size_categories-1K<n<10K #language-English #3d meshes #point clouds #synthetic #realistic #CAD #statues #region-us \n"
] |
bb3b1e2d4d9fee2a83d538096a7409bf4e305d21 |
# Dataset Card for Evaluation run of saishf/West-Maid-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saishf/West-Maid-7B](https://huggingface.co/saishf/West-Maid-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saishf__West-Maid-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T10:56:41.095810](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__West-Maid-7B/blob/main/results_2024-02-13T10-56-41.095810.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6511022834562965,
"acc_stderr": 0.03201231022208247,
"acc_norm": 0.6525220270281931,
"acc_norm_stderr": 0.032663176772250904,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5100402157242979,
"mc2_stderr": 0.015147927272675395
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491894,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719339
},
"harness|hellaswag|10": {
"acc": 0.6742680740888269,
"acc_stderr": 0.00467689886197891,
"acc_norm": 0.8643696474805815,
"acc_norm_stderr": 0.003416958591324802
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813822,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813822
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634285,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368982,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258165,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258165
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.01615591072134177,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.01615591072134177
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214963,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5100402157242979,
"mc2_stderr": 0.015147927272675395
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.01062696452997185
},
"harness|gsm8k|5": {
"acc": 0.623199393479909,
"acc_stderr": 0.013347858757829154
}
}
```
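As a usage note, the aggregated numbers above can also be read programmatically from the "results" configuration; a minimal sketch, assuming the metrics are stored as rows of the "latest" split as suggested by the configuration list in this card's metadata:

```python
from datasets import load_dataset

# The "latest" split always points to the most recent run's aggregated results.
results = load_dataset("open-llm-leaderboard/details_saishf__West-Maid-7B",
                       "results",
                       split="latest")
print(results[0])  # inspect the aggregated metrics for this run
```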
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_saishf__West-Maid-7B | [
"region:us"
] | 2024-02-13T10:58:59+00:00 | {"pretty_name": "Evaluation run of saishf/West-Maid-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [saishf/West-Maid-7B](https://huggingface.co/saishf/West-Maid-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saishf__West-Maid-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T10:56:41.095810](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__West-Maid-7B/blob/main/results_2024-02-13T10-56-41.095810.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511022834562965,\n \"acc_stderr\": 0.03201231022208247,\n \"acc_norm\": 0.6525220270281931,\n \"acc_norm_stderr\": 0.032663176772250904,\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5100402157242979,\n \"mc2_stderr\": 0.015147927272675395\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491894,\n \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719339\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6742680740888269,\n \"acc_stderr\": 0.00467689886197891,\n \"acc_norm\": 0.8643696474805815,\n \"acc_norm_stderr\": 0.003416958591324802\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n 
\"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813822,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813822\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n 
\"acc_stderr\": 0.023854795680971125,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634285,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n 
\"acc_norm_stderr\": 0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258165,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258165\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n \"acc_stderr\": 0.01615591072134177,\n \"acc_norm\": 0.37094972067039106,\n \"acc_norm_stderr\": 0.01615591072134177\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5100402157242979,\n \"mc2_stderr\": 0.015147927272675395\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.01062696452997185\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.623199393479909,\n \"acc_stderr\": 0.013347858757829154\n }\n}\n```", "repo_url": "https://huggingface.co/saishf/West-Maid-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|arc:challenge|25_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|gsm8k|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hellaswag|10_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-56-41.095810.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-56-41.095810.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-56-41.095810.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T10-56-41.095810.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-56-41.095810.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-56-41.095810.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["**/details_harness|winogrande|5_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T10-56-41.095810.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T10_56_41.095810", "path": ["results_2024-02-13T10-56-41.095810.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T10-56-41.095810.parquet"]}]}]} | 2024-02-13T10:59:20+00:00 | [] | [] | TAGS
#region-us
6da7ef0e0751e56e1c5f943b7ff3a2ebbd6e188e |
# Dataset Card for Evaluation run of saishf/Kuno-Lake-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saishf/Kuno-Lake-7B](https://huggingface.co/saishf/Kuno-Lake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saishf__Kuno-Lake-7B",
"harness_winogrande_5",
split="train")
```
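Beyond a single task, you can enumerate the available configurations and pull the aggregated metrics instead. A minimal sketch, assuming only the conventions described above (`get_dataset_config_names` is part of the `datasets` library):

```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate every configuration of this dataset: one per evaluated task,
# plus the aggregated "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_saishf__Kuno-Lake-7B")
print(len(configs), configs[:3])

# The "latest" split of the "results" configuration always points to the
# most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_saishf__Kuno-Lake-7B",
                       "results",
                       split="latest")
print(results)
```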
## Latest results
These are the [latest results from run 2024-02-13T11:02:51.081639](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Kuno-Lake-7B/blob/main/results_2024-02-13T11-02-51.081639.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6527256027565543,
"acc_stderr": 0.03206544874440503,
"acc_norm": 0.6527481253656698,
"acc_norm_stderr": 0.03272966202308312,
"mc1": 0.5250917992656059,
"mc1_stderr": 0.017481446804104014,
"mc2": 0.6683041706847399,
"mc2_stderr": 0.015246173402699752
},
"harness|arc:challenge|25": {
"acc": 0.689419795221843,
"acc_stderr": 0.013522292098053052,
"acc_norm": 0.7184300341296929,
"acc_norm_stderr": 0.013143376735009022
},
"harness|hellaswag|10": {
"acc": 0.7165903206532563,
"acc_stderr": 0.004497325533959636,
"acc_norm": 0.8814977096195977,
"acc_norm_stderr": 0.0032254141192897138
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.0154808268653743,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.0154808268653743
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128136,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066309,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066309
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525818,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47374301675977654,
"acc_stderr": 0.01669942767278476,
"acc_norm": 0.47374301675977654,
"acc_norm_stderr": 0.01669942767278476
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083141,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083141
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5250917992656059,
"mc1_stderr": 0.017481446804104014,
"mc2": 0.6683041706847399,
"mc2_stderr": 0.015246173402699752
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775777
},
"harness|gsm8k|5": {
"acc": 0.6535253980288097,
"acc_stderr": 0.013107179054313411
}
}
```
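As a quick way to read this report, here is a small sketch that ranks tasks by normalized accuracy. The dictionary below copies just a few entries from the JSON above; the full report contains one entry per harness task:

```python
# A few entries copied from the JSON report above; the full report has one
# entry per "harness|..." task.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7184300341296929},
    "harness|hellaswag|10": {"acc_norm": 0.8814977096195977},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.31},
}

# Rank tasks by normalized accuracy, best first.
for task, metrics in sorted(results.items(),
                            key=lambda kv: kv[1]["acc_norm"],
                            reverse=True):
    print(f"{task}: acc_norm = {metrics['acc_norm']:.4f}")
```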
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_saishf__Kuno-Lake-7B | [
"region:us"
] | 2024-02-13T11:05:12+00:00 | {"pretty_name": "Evaluation run of saishf/Kuno-Lake-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [saishf/Kuno-Lake-7B](https://huggingface.co/saishf/Kuno-Lake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saishf__Kuno-Lake-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T11:02:51.081639](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Kuno-Lake-7B/blob/main/results_2024-02-13T11-02-51.081639.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527256027565543,\n \"acc_stderr\": 0.03206544874440503,\n \"acc_norm\": 0.6527481253656698,\n \"acc_norm_stderr\": 0.03272966202308312,\n \"mc1\": 0.5250917992656059,\n \"mc1_stderr\": 0.017481446804104014,\n \"mc2\": 0.6683041706847399,\n \"mc2_stderr\": 0.015246173402699752\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.689419795221843,\n \"acc_stderr\": 0.013522292098053052,\n \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009022\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7165903206532563,\n \"acc_stderr\": 0.004497325533959636,\n \"acc_norm\": 0.8814977096195977,\n \"acc_norm_stderr\": 0.0032254141192897138\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n 
\"acc_stderr\": 0.024121125416941197,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.0154808268653743,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.0154808268653743\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128136,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128136\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066309,\n \"acc_norm\": 0.8288633461047255,\n 
\"acc_norm_stderr\": 0.013468201614066309\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525818,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525818\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47374301675977654,\n \"acc_stderr\": 0.01669942767278476,\n \"acc_norm\": 0.47374301675977654,\n \"acc_norm_stderr\": 0.01669942767278476\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083141,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083141\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5250917992656059,\n \"mc1_stderr\": 0.017481446804104014,\n \"mc2\": 0.6683041706847399,\n \"mc2_stderr\": 0.015246173402699752\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775777\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6535253980288097,\n \"acc_stderr\": 0.013107179054313411\n }\n}\n```", "repo_url": "https://huggingface.co/saishf/Kuno-Lake-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|arc:challenge|25_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|gsm8k|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hellaswag|10_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-02-51.081639.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-02-51.081639.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-02-51.081639.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T11-02-51.081639.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-02-51.081639.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-02-51.081639.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["**/details_harness|winogrande|5_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T11-02-51.081639.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T11_02_51.081639", "path": ["results_2024-02-13T11-02-51.081639.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T11-02-51.081639.parquet"]}]}]} | 2024-02-13T11:05:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of saishf/Kuno-Lake-7B
Dataset automatically created during the evaluation run of model [saishf/Kuno-Lake-7B](https://huggingface.co/saishf/Kuno-Lake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
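For example, with the `datasets` library (the configuration name below is one of the per-task configurations listed in this dataset's metadata; any other configuration can be substituted):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_saishf__Kuno-Lake-7B",
    "harness_winogrande_5",
    split="train")
```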
## Latest results
These are the latest results from run 2024-02-13T11:02:51.081639 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval).
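The aggregated metrics themselves can be loaded from the "results" configuration. A minimal sketch, assuming the standard `datasets` API and the "latest"-split convention described above (`to_pandas()` is a generic `datasets` convenience conversion, not something specific to this card):

```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_saishf__Kuno-Lake-7B",
    "results",
    split="latest")

# Convert to a pandas DataFrame to inspect the aggregated metrics.
df = results.to_pandas()
```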
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
# Dataset Card for Evaluation run of saishf/Kuro-Lotus-10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saishf/Kuro-Lotus-10.7B](https://huggingface.co/saishf/Kuro-Lotus-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saishf__Kuro-Lotus-10.7B",
"harness_winogrande_5",
split="train")
```
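Each configuration also exposes the run as a timestamped split, which makes it possible to pin an exact evaluation instead of following "latest". A sketch, assuming the split is named after the run timestamp in the `YYYY_MM_DDTHH_MM_SS.ffffff` form used by the other cards in this collection:

```python
from datasets import load_dataset

# Pin the exact run reported in this card rather than the moving "latest" split.
data = load_dataset("open-llm-leaderboard/details_saishf__Kuro-Lotus-10.7B",
    "harness_winogrande_5",
    split="2024_02_13T11_03_22.904872")
```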
## Latest results
These are the [latest results from run 2024-02-13T11:03:22.904872](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Kuro-Lotus-10.7B/blob/main/results_2024-02-13T11-03-22.904872.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6694085008188662,
"acc_stderr": 0.03144841125695069,
"acc_norm": 0.6702869793586165,
"acc_norm_stderr": 0.032092139305259296,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.5826555768468422,
"mc2_stderr": 0.015676012670254088
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.013855831287497728,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.013552671543623494
},
"harness|hellaswag|10": {
"acc": 0.6870145389364668,
"acc_stderr": 0.004627607991626913,
"acc_norm": 0.8751244771957777,
"acc_norm_stderr": 0.003299021089089749
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810535,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810535
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603915,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360754,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360754
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.023454674889404295,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.023454674889404295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135353,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700486,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700486
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997865,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997865
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.869198312236287,
"acc_stderr": 0.02194876605947076,
"acc_norm": 0.869198312236287,
"acc_norm_stderr": 0.02194876605947076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822915,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822915
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077816,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077816
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368976,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368976
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047707,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340863,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340863
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008557,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008557
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49869621903520206,
"acc_stderr": 0.012770192691057112,
"acc_norm": 0.49869621903520206,
"acc_norm_stderr": 0.012770192691057112
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.01855063450295296,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.01855063450295296
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073153,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.5826555768468422,
"mc2_stderr": 0.015676012670254088
},
"harness|winogrande|5": {
"acc": 0.8421468034727704,
"acc_stderr": 0.010247165248719763
},
"harness|gsm8k|5": {
"acc": 0.6611068991660348,
"acc_stderr": 0.013037955768562507
}
}
```
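These aggregated numbers can also be retrieved programmatically from the `results` configuration, whose `latest` split always points to the newest run. The snippet below is a minimal sketch assuming the standard `datasets` API:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run; the "latest"
# split always points at the newest results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_saishf__Kuro-Lotus-10.7B",
    "results",
    split="latest",
)

# Inspect which metric columns are available before indexing into them.
print(results.column_names)
```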
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
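Pending a full field-level description, the overall layout can be inspected directly: one configuration per evaluated task (63 per this card), plus the aggregated `results` configuration, each exposing a timestamped split and a `latest` split. A minimal sketch, assuming the standard `datasets` helpers:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

# List the per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_saishf__Kuro-Lotus-10.7B"
)
print(len(configs), configs[:5])

# Each configuration exposes a timestamped split and a "latest" split.
print(get_dataset_split_names(
    "open-llm-leaderboard/details_saishf__Kuro-Lotus-10.7B",
    configs[0],
))
```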
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-02-13T11:05:37+00:00 | {"pretty_name": "Evaluation run of saishf/Kuro-Lotus-10.7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [saishf/Kuro-Lotus-10.7B](https://huggingface.co/saishf/Kuro-Lotus-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saishf__Kuro-Lotus-10.7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T11:03:22.904872](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Kuro-Lotus-10.7B/blob/main/results_2024-02-13T11-03-22.904872.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6694085008188662,\n \"acc_stderr\": 0.03144841125695069,\n \"acc_norm\": 0.6702869793586165,\n \"acc_norm_stderr\": 0.032092139305259296,\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.5826555768468422,\n \"mc2_stderr\": 0.015676012670254088\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.013855831287497728,\n \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.013552671543623494\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6870145389364668,\n \"acc_stderr\": 0.004627607991626913,\n \"acc_norm\": 0.8751244771957777,\n \"acc_norm_stderr\": 0.003299021089089749\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810535,\n \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810535\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n 
\"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603915,\n \"acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360754,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360754\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6897435897435897,\n \"acc_stderr\": 0.023454674889404295,\n \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.023454674889404295\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135353,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135353\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700486,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700486\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997865,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997865\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.869198312236287,\n \"acc_stderr\": 0.02194876605947076,\n \"acc_norm\": 0.869198312236287,\n \"acc_norm_stderr\": 0.02194876605947076\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137276,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137276\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077816,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077816\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368976,\n 
\"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368976\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340863,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340863\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008557,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008557\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49869621903520206,\n \"acc_stderr\": 0.012770192691057112,\n \"acc_norm\": 0.49869621903520206,\n \"acc_norm_stderr\": 0.012770192691057112\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.01855063450295296,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.01855063450295296\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073153,\n \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073153\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.5826555768468422,\n \"mc2_stderr\": 0.015676012670254088\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6611068991660348,\n \"acc_stderr\": 0.013037955768562507\n }\n}\n```", "repo_url": "https://huggingface.co/saishf/Kuro-Lotus-10.7B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|arc:challenge|25_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|gsm8k|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hellaswag|10_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-03-22.904872.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-03-22.904872.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-03-22.904872.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T11-03-22.904872.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-03-22.904872.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-03-22.904872.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["**/details_harness|winogrande|5_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T11-03-22.904872.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T11_03_22.904872", "path": ["results_2024-02-13T11-03-22.904872.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T11-03-22.904872.parquet"]}]}]} | 2024-02-13T11:05:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of saishf/Kuro-Lotus-10.7B
Dataset automatically created during the evaluation run of model saishf/Kuro-Lotus-10.7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration; the split is named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
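A minimal sketch of the usual loading pattern (the code block was stripped from this plain-text copy of the card; the repository id below is inferred from the leaderboard's `details_{org}__{model}` naming convention, so treat it as an assumption):
```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the card's naming convention.
data = load_dataset("open-llm-leaderboard/details_saishf__Kuro-Lotus-10.7B",
	"harness_winogrande_5",
	split="train")
```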
## Latest results
These are the latest results from run 2024-02-13T11:03:22.904872 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
99052f27b8105c1abf938a00a2466bedc261c1f9 |
# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.12
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SF-Foundation/Ein-72B-v0.12](https://huggingface.co/SF-Foundation/Ein-72B-v0.12) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration; the split is named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12",
"harness_winogrande_5",
split="train")
```
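To see which of the 63 configurations exist before loading, or to pin an exact run instead of following the moving "latest"/"train" split, here is a short sketch (the timestamped split name is copied from this card's metadata and is only an example):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12"

# Every per-task configuration plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Pin the exact run rather than following the latest results.
details = load_dataset(repo,
	"harness_winogrande_5",
	split="2024_02_13T11_06_19.237402")
```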
## Latest results
These are the [latest results from run 2024-02-13T11:06:19.237402](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12/blob/main/results_2024-02-13T11-06-19.237402.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7720004576068558,
"acc_stderr": 0.028018920061937066,
"acc_norm": 0.77366212968727,
"acc_norm_stderr": 0.028576972189266775,
"mc1": 0.6597307221542228,
"mc1_stderr": 0.016586304901762553,
"mc2": 0.7778465654225306,
"mc2_stderr": 0.013819882710780051
},
"harness|arc:challenge|25": {
"acc": 0.7406143344709898,
"acc_stderr": 0.01280827357392709,
"acc_norm": 0.7619453924914675,
"acc_norm_stderr": 0.0124457700280262
},
"harness|hellaswag|10": {
"acc": 0.7251543517227644,
"acc_stderr": 0.004455240755811573,
"acc_norm": 0.8946425014937264,
"acc_norm_stderr": 0.003063860621772738
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8377358490566038,
"acc_stderr": 0.02269148287203535,
"acc_norm": 0.8377358490566038,
"acc_norm_stderr": 0.02269148287203535
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9375,
"acc_stderr": 0.02024219611347799,
"acc_norm": 0.9375,
"acc_norm_stderr": 0.02024219611347799
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8870967741935484,
"acc_stderr": 0.0180036033258636,
"acc_norm": 0.8870967741935484,
"acc_norm_stderr": 0.0180036033258636
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.016999994927421592,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.016999994927421592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084315,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084315
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.01988016540658877,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.01988016540658877
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45555555555555555,
"acc_stderr": 0.03036486250482443,
"acc_norm": 0.45555555555555555,
"acc_norm_stderr": 0.03036486250482443
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5562913907284768,
"acc_stderr": 0.04056527902281732,
"acc_norm": 0.5562913907284768,
"acc_norm_stderr": 0.04056527902281732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9357798165137615,
"acc_stderr": 0.010510494713201403,
"acc_norm": 0.9357798165137615,
"acc_norm_stderr": 0.010510494713201403
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6898148148148148,
"acc_stderr": 0.03154696285656627,
"acc_norm": 0.6898148148148148,
"acc_norm_stderr": 0.03154696285656627
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.02871877688934232,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.02871877688934232
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446914,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446914
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9195402298850575,
"acc_stderr": 0.009726831316141866,
"acc_norm": 0.9195402298850575,
"acc_norm_stderr": 0.009726831316141866
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8410404624277457,
"acc_stderr": 0.019685307033571946,
"acc_norm": 0.8410404624277457,
"acc_norm_stderr": 0.019685307033571946
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6960893854748603,
"acc_stderr": 0.015382845587584517,
"acc_norm": 0.6960893854748603,
"acc_norm_stderr": 0.015382845587584517
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.02046417512433263,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.02046417512433263
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8456591639871383,
"acc_stderr": 0.02051905034208471,
"acc_norm": 0.8456591639871383,
"acc_norm_stderr": 0.02051905034208471
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.019061588181505405,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.019061588181505405
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6631205673758865,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.6631205673758865,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6082138200782269,
"acc_stderr": 0.012467564418145118,
"acc_norm": 0.6082138200782269,
"acc_norm_stderr": 0.012467564418145118
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.022368672562886747,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.022368672562886747
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273337,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273337
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.020687186951534094,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.020687186951534094
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276894,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276894
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6597307221542228,
"mc1_stderr": 0.016586304901762553,
"mc2": 0.7778465654225306,
"mc2_stderr": 0.013819882710780051
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775778
},
"harness|gsm8k|5": {
"acc": 0.7922668688400303,
"acc_stderr": 0.011174572716705886
}
}
```
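The aggregated numbers above are also stored in the dataset's "results" configuration, so they can be read back programmatically; a minimal sketch, assuming the split conventions described earlier (the exact column layout of the parquet row may differ between harness versions):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics shown in the JSON above.
results = load_dataset(
    "open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12",
    "results",
    split="latest",
)
print(results[0])  # inspect the row; field names should mirror the JSON above
```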
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12 | [
"region:us"
] | 2024-02-13T11:08:28+00:00 | {"pretty_name": "Evaluation run of SF-Foundation/Ein-72B-v0.12", "dataset_summary": "Dataset automatically created during the evaluation run of model [SF-Foundation/Ein-72B-v0.12](https://huggingface.co/SF-Foundation/Ein-72B-v0.12) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T11:06:19.237402](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12/blob/main/results_2024-02-13T11-06-19.237402.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7720004576068558,\n \"acc_stderr\": 0.028018920061937066,\n \"acc_norm\": 0.77366212968727,\n \"acc_norm_stderr\": 0.028576972189266775,\n \"mc1\": 0.6597307221542228,\n \"mc1_stderr\": 0.016586304901762553,\n \"mc2\": 0.7778465654225306,\n \"mc2_stderr\": 0.013819882710780051\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7406143344709898,\n \"acc_stderr\": 0.01280827357392709,\n \"acc_norm\": 0.7619453924914675,\n \"acc_norm_stderr\": 0.0124457700280262\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7251543517227644,\n \"acc_stderr\": 0.004455240755811573,\n \"acc_norm\": 0.8946425014937264,\n \"acc_norm_stderr\": 0.003063860621772738\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8377358490566038,\n \"acc_stderr\": 0.02269148287203535,\n \"acc_norm\": 0.8377358490566038,\n \"acc_norm_stderr\": 0.02269148287203535\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9375,\n \"acc_stderr\": 0.02024219611347799,\n \"acc_norm\": 0.9375,\n \"acc_norm_stderr\": 0.02024219611347799\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349417,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.03455930201924811,\n \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.03455930201924811\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047732,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047732\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8870967741935484,\n \"acc_stderr\": 0.0180036033258636,\n \"acc_norm\": 0.8870967741935484,\n \"acc_norm_stderr\": 0.0180036033258636\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.016999994927421592,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.016999994927421592\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084315,\n \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084315\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.8102564102564103,\n \"acc_stderr\": 0.01988016540658877,\n \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.01988016540658877\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45555555555555555,\n \"acc_stderr\": 0.03036486250482443,\n \"acc_norm\": 0.45555555555555555,\n \"acc_norm_stderr\": 0.03036486250482443\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5562913907284768,\n \"acc_stderr\": 0.04056527902281732,\n \"acc_norm\": 0.5562913907284768,\n \"acc_norm_stderr\": 0.04056527902281732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9357798165137615,\n \"acc_stderr\": 0.010510494713201403,\n \"acc_norm\": 0.9357798165137615,\n \"acc_norm_stderr\": 0.010510494713201403\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6898148148148148,\n \"acc_stderr\": 0.03154696285656627,\n \"acc_norm\": 0.6898148148148148,\n \"acc_norm_stderr\": 0.03154696285656627\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.015006312806446914,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.015006312806446914\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9195402298850575,\n \"acc_stderr\": 0.009726831316141866,\n 
\"acc_norm\": 0.9195402298850575,\n \"acc_norm_stderr\": 0.009726831316141866\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6960893854748603,\n \"acc_stderr\": 0.015382845587584517,\n \"acc_norm\": 0.6960893854748603,\n \"acc_norm_stderr\": 0.015382845587584517\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.02046417512433263,\n \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.02046417512433263\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8456591639871383,\n \"acc_stderr\": 0.02051905034208471,\n \"acc_norm\": 0.8456591639871383,\n \"acc_norm_stderr\": 0.02051905034208471\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.019061588181505405,\n \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.019061588181505405\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6631205673758865,\n \"acc_stderr\": 0.02819553487396673,\n \"acc_norm\": 0.6631205673758865,\n \"acc_norm_stderr\": 0.02819553487396673\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6082138200782269,\n \"acc_stderr\": 0.012467564418145118,\n \"acc_norm\": 0.6082138200782269,\n \"acc_norm_stderr\": 0.012467564418145118\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.022368672562886747,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.022368672562886747\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273337,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273337\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n \"acc_stderr\": 0.020687186951534094,\n \"acc_norm\": 0.9054726368159204,\n \"acc_norm_stderr\": 0.020687186951534094\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276894,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276894\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6597307221542228,\n \"mc1_stderr\": 0.016586304901762553,\n \"mc2\": 0.7778465654225306,\n \"mc2_stderr\": 0.013819882710780051\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7922668688400303,\n \"acc_stderr\": 0.011174572716705886\n }\n}\n```", "repo_url": 
"https://huggingface.co/SF-Foundation/Ein-72B-v0.12", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|arc:challenge|25_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|gsm8k|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hellaswag|10_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-06-19.237402.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-06-19.237402.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-06-19.237402.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T11-06-19.237402.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-06-19.237402.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-06-19.237402.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["**/details_harness|winogrande|5_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T11-06-19.237402.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T11_06_19.237402", "path": ["results_2024-02-13T11-06-19.237402.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T11-06-19.237402.parquet"]}]}]} | 2024-02-13T11:08:50+00:00 | [] | [] | TAGS
# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.12
Dataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.12 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
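(The snippet itself was stripped from this copy of the card; a minimal sketch of it, assuming the leaderboard's usual `details_<org>__<model>` repo naming:)

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's naming convention for details datasets
data = load_dataset("open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12",
	"harness_winogrande_5",
	split="train")
```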
## Latest results
These are the latest results from run 2024-02-13T11:06:19.237402 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.12\n\n\n\nDataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.12 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T11:06:19.237402(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.12\n\n\n\nDataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.12 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T11:06:19.237402(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.12\n\n\n\nDataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.12 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T11:06:19.237402(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
# Dataset Card for Evaluation run of saishf/Top-Western-Maid-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saishf/Top-Western-Maid-7B](https://huggingface.co/saishf/Top-Western-Maid-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saishf__Top-Western-Maid-7B",
"harness_winogrande_5",
split="train")
```
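The aggregated "results" configuration mentioned above can be loaded the same way; a minimal sketch (the "latest" split always points at the most recent run, per this card's metadata):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model
results = load_dataset("open-llm-leaderboard/details_saishf__Top-Western-Maid-7B",
	"results",
	split="latest")
```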
## Latest results
These are the [latest results from run 2024-02-13T11:07:35.441841](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Top-Western-Maid-7B/blob/main/results_2024-02-13T11-07-35.441841.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6506958953045886,
"acc_stderr": 0.03211223802352257,
"acc_norm": 0.6509594125199996,
"acc_norm_stderr": 0.032774521752530913,
"mc1": 0.42472460220318237,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.5879284610844989,
"mc2_stderr": 0.015340978033780782
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205763,
"acc_norm": 0.6936860068259386,
"acc_norm_stderr": 0.013470584417276513
},
"harness|hellaswag|10": {
"acc": 0.6974706233817964,
"acc_stderr": 0.004584144014654942,
"acc_norm": 0.8740290778729337,
"acc_norm_stderr": 0.003311384498158642
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.02389187954195961,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.02389187954195961
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949832,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949832
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42472460220318237,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.5879284610844989,
"mc2_stderr": 0.015340978033780782
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.6595905989385898,
"acc_stderr": 0.013052097103299104
}
}
```
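For quick sanity checks, the block above can be treated as plain JSON and aggregated over tasks; a minimal sketch in pure Python (`results_json` is a truncated stand-in for the full dict printed above):

```python
import json

# Truncated stand-in for the results JSON shown above
results_json = """
{
  "harness|winogrande|5": {"acc": 0.8326756116811366, "acc_stderr": 0.010490608806828075},
  "harness|gsm8k|5": {"acc": 0.6595905989385898, "acc_stderr": 0.013052097103299104}
}
"""
metrics = json.loads(results_json)

# Mean accuracy over every task that reports an "acc" field
accs = [task["acc"] for task in metrics.values() if "acc" in task]
print(sum(accs) / len(accs))  # 0.7461331053098632 for the two tasks above
```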
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
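No structure description is provided, but the configuration names can at least be enumerated directly from the Hub; a minimal sketch using the `datasets` API (network access assumed):

```python
from datasets import get_dataset_config_names

# Per-task configurations (63 per the summary above) plus the aggregated "results" config
configs = get_dataset_config_names("open-llm-leaderboard/details_saishf__Top-Western-Maid-7B")
print(len(configs), configs[:5])
```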
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_saishf__Top-Western-Maid-7B | [
"region:us"
] | 2024-02-13T11:09:57+00:00 | {"pretty_name": "Evaluation run of saishf/Top-Western-Maid-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [saishf/Top-Western-Maid-7B](https://huggingface.co/saishf/Top-Western-Maid-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saishf__Top-Western-Maid-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T11:07:35.441841](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Top-Western-Maid-7B/blob/main/results_2024-02-13T11-07-35.441841.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6506958953045886,\n \"acc_stderr\": 0.03211223802352257,\n \"acc_norm\": 0.6509594125199996,\n \"acc_norm_stderr\": 0.032774521752530913,\n \"mc1\": 0.42472460220318237,\n \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.5879284610844989,\n \"mc2_stderr\": 0.015340978033780782\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205763,\n \"acc_norm\": 0.6936860068259386,\n \"acc_norm_stderr\": 0.013470584417276513\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6974706233817964,\n \"acc_stderr\": 0.004584144014654942,\n \"acc_norm\": 0.8740290778729337,\n \"acc_norm_stderr\": 0.003311384498158642\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 
0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.02389187954195961,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.02389187954195961\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.012740853872949832,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.012740853872949832\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42472460220318237,\n \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.5879284610844989,\n \"mc2_stderr\": 0.015340978033780782\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6595905989385898,\n \"acc_stderr\": 0.013052097103299104\n }\n}\n```", "repo_url": "https://huggingface.co/saishf/Top-Western-Maid-7B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|arc:challenge|25_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|gsm8k|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hellaswag|10_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-07-35.441841.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-07-35.441841.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-07-35.441841.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T11-07-35.441841.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-07-35.441841.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-07-35.441841.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["**/details_harness|winogrande|5_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T11-07-35.441841.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T11_07_35.441841", "path": ["results_2024-02-13T11-07-35.441841.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T11-07-35.441841.parquet"]}]}]} | 2024-02-13T11:10:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of saishf/Top-Western-Maid-7B
Dataset automatically created during the evaluation run of model saishf/Top-Western-Maid-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
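The snippet below follows the same pattern used by other leaderboard detail repos; note that the repo id is inferred from the standard `details_<org>__<model>` naming convention rather than confirmed in this card, while the `harness_winogrande_5` configuration and the timestamped/"latest" splits are listed in the metadata above.

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's details_<org>__<model> convention.
data = load_dataset("open-llm-leaderboard/details_saishf__Top-Western-Maid-7B",
	"harness_winogrande_5",
	split="train")
```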
## Latest results
These are the latest results from run 2024-02-13T11:07:35.441841 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of saishf/Top-Western-Maid-7B\n\n\n\nDataset automatically created during the evaluation run of model saishf/Top-Western-Maid-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T11:07:35.441841(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of saishf/Top-Western-Maid-7B\n\n\n\nDataset automatically created during the evaluation run of model saishf/Top-Western-Maid-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T11:07:35.441841(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of saishf/Top-Western-Maid-7B\n\n\n\nDataset automatically created during the evaluation run of model saishf/Top-Western-Maid-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T11:07:35.441841(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
a9f635bb820453dacca3260e1110e24705a096d3 |
This dataset was created from the Falcon instruction dataset; I used the facebook nllb-200-distilled-600M model to translate part of it from English to Turkish. | umarigan/falcon_feedback_instraction_Turkish | [
"task_categories:question-answering",
"task_categories:conversational",
"task_categories:text2text-generation",
"size_categories:1K<n<10K",
"language:tr",
"region:us"
] | 2024-02-13T11:10:28+00:00 | {"language": ["tr"], "size_categories": ["1K<n<10K"], "task_categories": ["question-answering", "conversational", "text2text-generation"], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2872654, "num_examples": 3139}], "download_size": 1837026, "dataset_size": 2872654}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-13T11:14:27+00:00 | [] | [
"tr"
] | TAGS
#task_categories-question-answering #task_categories-conversational #task_categories-text2text-generation #size_categories-1K<n<10K #language-Turkish #region-us
|
This dataset was created from the Falcon instruction dataset; I used the facebook nllb-200-distilled-600M model to translate part of it from English to Turkish. | [] | [
"TAGS\n#task_categories-question-answering #task_categories-conversational #task_categories-text2text-generation #size_categories-1K<n<10K #language-Turkish #region-us \n"
] | [
59
] | [
"passage: TAGS\n#task_categories-question-answering #task_categories-conversational #task_categories-text2text-generation #size_categories-1K<n<10K #language-Turkish #region-us \n"
] |
4473b3c0588a4608a906b27a5d01ae1e8d9591b7 |
# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.13
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SF-Foundation/Ein-72B-v0.13](https://huggingface.co/SF-Foundation/Ein-72B-v0.13) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.13",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T11:08:38.011897](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.13/blob/main/results_2024-02-13T11-08-38.011897.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.771145319640591,
"acc_stderr": 0.028040404393338988,
"acc_norm": 0.7726745322759058,
"acc_norm_stderr": 0.028602012805609787,
"mc1": 0.6585067319461444,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.7781799832544576,
"mc2_stderr": 0.0138330368770164
},
"harness|arc:challenge|25": {
"acc": 0.742320819112628,
"acc_stderr": 0.012780770562768416,
"acc_norm": 0.7619453924914675,
"acc_norm_stderr": 0.0124457700280262
},
"harness|hellaswag|10": {
"acc": 0.723859788886676,
"acc_stderr": 0.004461732908157662,
"acc_norm": 0.8944433379804819,
"acc_norm_stderr": 0.0030664137765701476
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8490566037735849,
"acc_stderr": 0.022032988985703494,
"acc_norm": 0.8490566037735849,
"acc_norm_stderr": 0.022032988985703494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9375,
"acc_stderr": 0.02024219611347799,
"acc_norm": 0.9375,
"acc_norm_stderr": 0.02024219611347799
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7655172413793103,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.7655172413793103,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6878306878306878,
"acc_stderr": 0.023865206836972592,
"acc_norm": 0.6878306878306878,
"acc_norm_stderr": 0.023865206836972592
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8870967741935484,
"acc_stderr": 0.01800360332586361,
"acc_norm": 0.8870967741935484,
"acc_norm_stderr": 0.01800360332586361
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6600985221674877,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.6600985221674877,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.016999994927421592,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.016999994927421592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084315,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084315
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8,
"acc_stderr": 0.020280805062535722,
"acc_norm": 0.8,
"acc_norm_stderr": 0.020280805062535722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.44814814814814813,
"acc_stderr": 0.030321167196316275,
"acc_norm": 0.44814814814814813,
"acc_norm_stderr": 0.030321167196316275
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5629139072847682,
"acc_stderr": 0.040500357222306355,
"acc_norm": 0.5629139072847682,
"acc_norm_stderr": 0.040500357222306355
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9339449541284404,
"acc_stderr": 0.01064913148785894,
"acc_norm": 0.9339449541284404,
"acc_norm_stderr": 0.01064913148785894
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03179876342176853,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03179876342176853
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597453,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597453
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622793,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622793
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8404907975460123,
"acc_stderr": 0.02876748172598387,
"acc_norm": 0.8404907975460123,
"acc_norm_stderr": 0.02876748172598387
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9169859514687101,
"acc_stderr": 0.009866287394639536,
"acc_norm": 0.9169859514687101,
"acc_norm_stderr": 0.009866287394639536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8410404624277457,
"acc_stderr": 0.019685307033571946,
"acc_norm": 0.8410404624277457,
"acc_norm_stderr": 0.019685307033571946
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6994413407821229,
"acc_stderr": 0.015334566806251164,
"acc_norm": 0.6994413407821229,
"acc_norm_stderr": 0.015334566806251164
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.02046417512433263,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.02046417512433263
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8392282958199357,
"acc_stderr": 0.020862388082391894,
"acc_norm": 0.8392282958199357,
"acc_norm_stderr": 0.020862388082391894
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.019061588181505405,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.019061588181505405
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6631205673758865,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.6631205673758865,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.60625814863103,
"acc_stderr": 0.012478532272564435,
"acc_norm": 0.60625814863103,
"acc_norm_stderr": 0.012478532272564435
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.022368672562886747,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.022368672562886747
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273337,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273337
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.02412746346265016,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.02412746346265016
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659397,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276894,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276894
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6585067319461444,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.7781799832544576,
"mc2_stderr": 0.0138330368770164
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479677
},
"harness|gsm8k|5": {
"acc": 0.7930250189537529,
"acc_stderr": 0.011159498164891769
}
}
```
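The same aggregated figures can also be pulled programmatically. A minimal sketch, assuming only what the card states above: the "results" configuration holds the aggregated metrics, and each configuration exposes a "latest" split alongside the timestamped ones.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run live in the "results" config;
# the "latest" split always points to the newest evaluation.
results = load_dataset("open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.13",
	"results",
	split="latest")
```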
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.13 | [
"region:us"
] | 2024-02-13T11:10:47+00:00 | {"pretty_name": "Evaluation run of SF-Foundation/Ein-72B-v0.13", "dataset_summary": "Dataset automatically created during the evaluation run of model [SF-Foundation/Ein-72B-v0.13](https://huggingface.co/SF-Foundation/Ein-72B-v0.13) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.13\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T11:08:38.011897](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.13/blob/main/results_2024-02-13T11-08-38.011897.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.771145319640591,\n \"acc_stderr\": 0.028040404393338988,\n \"acc_norm\": 0.7726745322759058,\n \"acc_norm_stderr\": 0.028602012805609787,\n \"mc1\": 0.6585067319461444,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.7781799832544576,\n \"mc2_stderr\": 0.0138330368770164\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.742320819112628,\n \"acc_stderr\": 0.012780770562768416,\n \"acc_norm\": 0.7619453924914675,\n \"acc_norm_stderr\": 0.0124457700280262\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.723859788886676,\n \"acc_stderr\": 0.004461732908157662,\n \"acc_norm\": 0.8944433379804819,\n \"acc_norm_stderr\": 0.0030664137765701476\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8490566037735849,\n \"acc_stderr\": 0.022032988985703494,\n \"acc_norm\": 0.8490566037735849,\n \"acc_norm_stderr\": 0.022032988985703494\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9375,\n \"acc_stderr\": 0.02024219611347799,\n \"acc_norm\": 0.9375,\n \"acc_norm_stderr\": 0.02024219611347799\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349417,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7655172413793103,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.7655172413793103,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6878306878306878,\n \"acc_stderr\": 0.023865206836972592,\n \"acc_norm\": 0.6878306878306878,\n \"acc_norm_stderr\": 0.023865206836972592\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8870967741935484,\n \"acc_stderr\": 0.01800360332586361,\n \"acc_norm\": 0.8870967741935484,\n \"acc_norm_stderr\": 0.01800360332586361\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n \"acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.016999994927421592,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.016999994927421592\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084315,\n \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084315\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.8,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.44814814814814813,\n \"acc_stderr\": 0.030321167196316275,\n \"acc_norm\": 0.44814814814814813,\n \"acc_norm_stderr\": 0.030321167196316275\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5629139072847682,\n \"acc_stderr\": 0.040500357222306355,\n \"acc_norm\": 0.5629139072847682,\n \"acc_norm_stderr\": 0.040500357222306355\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9339449541284404,\n \"acc_stderr\": 0.01064913148785894,\n \"acc_norm\": 0.9339449541284404,\n \"acc_norm_stderr\": 0.01064913148785894\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.03179876342176853,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03179876342176853\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597453,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597453\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622793,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622793\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8404907975460123,\n \"acc_stderr\": 0.02876748172598387,\n \"acc_norm\": 0.8404907975460123,\n \"acc_norm_stderr\": 0.02876748172598387\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9169859514687101,\n \"acc_stderr\": 0.009866287394639536,\n \"acc_norm\": 0.9169859514687101,\n 
\"acc_norm_stderr\": 0.009866287394639536\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6994413407821229,\n \"acc_stderr\": 0.015334566806251164,\n \"acc_norm\": 0.6994413407821229,\n \"acc_norm_stderr\": 0.015334566806251164\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.02046417512433263,\n \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.02046417512433263\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8392282958199357,\n \"acc_stderr\": 0.020862388082391894,\n \"acc_norm\": 0.8392282958199357,\n \"acc_norm_stderr\": 0.020862388082391894\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.019061588181505405,\n \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.019061588181505405\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6631205673758865,\n \"acc_stderr\": 0.02819553487396673,\n \"acc_norm\": 0.6631205673758865,\n \"acc_norm_stderr\": 0.02819553487396673\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.60625814863103,\n \"acc_stderr\": 0.012478532272564435,\n \"acc_norm\": 0.60625814863103,\n \"acc_norm_stderr\": 0.012478532272564435\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.022368672562886747,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.022368672562886747\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273337,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273337\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.02412746346265016,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.02412746346265016\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659397,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276894,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276894\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6585067319461444,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.7781799832544576,\n \"mc2_stderr\": 0.0138330368770164\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479677\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7930250189537529,\n \"acc_stderr\": 0.011159498164891769\n }\n}\n```", "repo_url": "https://huggingface.co/SF-Foundation/Ein-72B-v0.13", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|arc:challenge|25_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|gsm8k|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hellaswag|10_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-08-38.011897.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-08-38.011897.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-08-38.011897.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T11-08-38.011897.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-08-38.011897.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-08-38.011897.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["**/details_harness|winogrande|5_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T11-08-38.011897.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T11_08_38.011897", "path": ["results_2024-02-13T11-08-38.011897.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T11-08-38.011897.parquet"]}]}]} | 2024-02-13T11:11:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.13
Dataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.13 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
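A minimal sketch, assuming the details repository follows the standard naming scheme for these evaluation datasets (`open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.13`, inferred from the parquet paths in the metadata above; any config name listed there can replace `harness_winogrande_5`):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task; the "train" split
# always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.13",
                    "harness_winogrande_5",
                    split="train")
```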
## Latest results
These are the latest results from run 2024-02-13T11:08:38.011897 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.13\n\n\n\nDataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.13 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T11:08:38.011897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.13\n\n\n\nDataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.13 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T11:08:38.011897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.13\n\n\n\nDataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.13 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T11:08:38.011897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
1b61051c192e9bc51ba875bd9f75c107733bc582 | # Dataset Card for Dataset Name
This dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.
## Dataset Details
### Dataset Description
The dataset comprises a comprehensive selection of topics, including but not limited to:
Frequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.
Inquiries pertaining to placements, encompassing strategies, tips, and common queries.
Questions related to fundamental concepts in Data Structures and Algorithms.
Queries and discussions regarding research papers, methodologies, and academic pursuits.
- **Curated by:** AI Team- IEEE SB VIT Pune
## Uses
This dataset was designed primarily for a chatbot for IEEE SB VIT Pune so that university students could use it for their own benefit, but it also includes some general topics related to Research Papers, Data Structures and Algorithms, and Placements that can be used by others for their own custom chatbots.
## Dataset Structure
The dataset consists of the following fields:
- **Instruction:** This field represents the prompt or query posed to the chatbot.
- **Response:** This field contains the corresponding generated response by the chatbot.
## Dataset Structure Information
The dataset is structured in a JSON format, with each entry containing the following fields:
```json
{
"instruction": "What is IEEE?",
"response": "The IEEE or Institute of Electrical and Electronics Engineers is the world's largest professional technical organization dedicated to the advancement of technology for the benefit of humanity."
}
```
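For completeness, a minimal loading sketch; the repository id `hriteshMaikap/IEEEChatbotAplha` and the split come from this card's metadata, which names the columns `Question` and `Answer` rather than the `instruction`/`response` labels used in the prose:

```python
from datasets import load_dataset

# Load the single "train" split listed in the card metadata.
ds = load_dataset("hriteshMaikap/IEEEChatbotAplha", split="train")

# The hosted columns are "Question" and "Answer" per the metadata,
# even though the prose above refers to "instruction" and "response".
example = ds[0]
print(example["Question"])
print(example["Answer"])
```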
### Curation Rationale
The motivation behind curating this dataset stems from a genuine desire to empower and support university students pursuing B.Tech degrees. Recognizing the pivotal role that IEEE Student Branch at Vishwakarma Institute of Technology (VIT) Pune plays in students' academic journeys, the aim was to create a resource that elucidates the myriad ways in which IEEE SB VIT Pune can enrich and enhance students' educational experiences.
At its core, this dataset is a testament to the commitment of the AI team at IEEE SB VIT Pune to empower B.Tech students with valuable insights and resources. By curating a comprehensive collection of topics spanning FAQs, placement strategies, technical concepts, and research discussions, the dataset seeks to equip students with the knowledge and understanding necessary to navigate their academic pursuits effectively.
## Dataset Card Authors
AI Team- IEEE SB VIT Pune
Mrunmayee Phadke (Project Head)
Hritesh Maikap
Nidhish
Arya Lokhande
Apurva Kota
Soham Nimale
| hriteshMaikap/IEEEChatbotAplha | [
"region:us"
] | 2024-02-13T12:09:26+00:00 | {"dataset_info": {"features": [{"name": "Question", "dtype": "string"}, {"name": "Answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2101813, "num_examples": 5526}], "download_size": 821355, "dataset_size": 2101813}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T06:23:45+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for Dataset Name
This dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.
## Dataset Details
### Dataset Description
The dataset comprises a comprehensive selection of topics, including but not limited to:
Frequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.
Inquiries pertaining to placements, encompassing strategies, tips, and common queries.
Questions related to fundamental concepts in Data Structures and Algorithms.
Queries and discussions regarding research papers, methodologies, and academic pursuits.
- Curated by: AI Team- IEEE SB VIT Pune
## Uses
This dataset was designed primarily for a chatbot for IEEE SB VIT Pune so that university students could use it for their own benefit, but it also includes some general topics related to Research Papers, Data Structures and Algorithms, and Placements that can be used by others for their own custom chatbots.
## Dataset Structure
The dataset consists of the following fields:
- Instruction: This field represents the prompt or query posed to the chatbot.
- Response: This field contains the corresponding generated response by the chatbot.
## Dataset Structure Information
The dataset is structured in a JSON format, with each entry containing the following fields:
'''json
{
"instruction": "What is IEEE?",
"response": "The IEEE or Institute of Electrical and Electronics Engineers is the world's largest professional technical organization dedicated to the advancement of technology for the benefit of humanity."
}
'''
### Curation Rationale
The motivation behind curating this dataset stems from a genuine desire to empower and support university students pursuing B.Tech degrees. Recognizing the pivotal role that IEEE Student Branch at Vishwakarma Institute of Technology (VIT) Pune plays in students' academic journeys, the aim was to create a resource that elucidates the myriad ways in which IEEE SB VIT Pune can enrich and enhance students' educational experiences.
At its core, this dataset is a testament to the commitment of the AI team at IEEE SB VIT Pune to empower B.Tech students with valuable insights and resources. By curating a comprehensive collection of topics spanning FAQs, placement strategies, technical concepts, and research discussions, the dataset seeks to equip students with the knowledge and understanding necessary to navigate their academic pursuits effectively.
## Dataset Card Authors
AI Team- IEEE SB VIT Pune
Mrunmayee Phadke (Project Head)
Hritesh Maikap
Nidhish
Arya Lokhande
Apurva Kota
Soham Nimale
| [
"# Dataset Card for Dataset Name\n\nThis dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.",
"## Dataset Details",
"### Dataset Description\nThe dataset comprises a comprehensive selection of topics, including but not limited to:\n\nFrequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.\nInquiries pertaining to placements, encompassing strategies, tips, and common queries.\nQuestions related to fundamental concepts in Data Structures and Algorithms.\nQueries and discussions regarding research papers, methodologies, and academic pursuits.\n\n- Curated by: AI Team- IEEE SB VIT Pune",
"## Uses\n\nThis data was particularly designed for a chatbot for IEEE SB VIT Pune so that university students could use it for their own benifits, but it includes some general topics related to Research Papers, Data Structure and Algorithms and Placements that can be used by others for their custom chatbot",
"## Dataset Structure\n\nThe dataset consists of the following fields:\n\n- Instruction: This field represents the prompt or query posed to the chatbot.\n- Response: This field contains the corresponding generated response by the chatbot.",
"## Dataset Structure Information\n\nThe dataset is structured in a JSON format, with each entry containing the following fields:\n\n'''json\n{\n \"instruction\": \"What is IEEE?\",\n \"response\": \"The IEEE or Institute of Electrical and Electronics Engineers is the world's largest professional technical organization dedicated to the advancement of technology for the benefit of humanity.\"\n}",
"### Curation Rationale\n\nThe motivation behind curating this dataset stems from a genuine desire to empower and support university students pursuing B.Tech degrees. Recognizing the pivotal role that IEEE Student Branch at Vishwakarma Institute of Technology (VIT) Pune plays in students' academic journeys, the aim was to create a resource that elucidates the myriad ways in which IEEE SB VIT Pune can enrich and enhance students' educational experiences.\nAt its core, this dataset is a testament to the commitment of the AI team at IEEE SB VIT Pune to empower B.Tech students with valuable insights and resources. By curating a comprehensive collection of topics spanning FAQs, placement strategies, technical concepts, and research discussions, the dataset seeks to equip students with the knowledge and understanding necessary to navigate their academic pursuits effectively.",
"## Dataset Card Authors\nAI Team- IEEE SB VIT Pune\nMrunmayee Phadke (Project Head)\nHritesh Maikap\nNidhish\nArya Lokhande\nApurva Kota\nSoham Nimale"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\nThis dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.",
"## Dataset Details",
"### Dataset Description\nThe dataset comprises a comprehensive selection of topics, including but not limited to:\n\nFrequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.\nInquiries pertaining to placements, encompassing strategies, tips, and common queries.\nQuestions related to fundamental concepts in Data Structures and Algorithms.\nQueries and discussions regarding research papers, methodologies, and academic pursuits.\n\n- Curated by: AI Team- IEEE SB VIT Pune",
"## Uses\n\nThis data was particularly designed for a chatbot for IEEE SB VIT Pune so that university students could use it for their own benifits, but it includes some general topics related to Research Papers, Data Structure and Algorithms and Placements that can be used by others for their custom chatbot",
"## Dataset Structure\n\nThe dataset consists of the following fields:\n\n- Instruction: This field represents the prompt or query posed to the chatbot.\n- Response: This field contains the corresponding generated response by the chatbot.",
"## Dataset Structure Information\n\nThe dataset is structured in a JSON format, with each entry containing the following fields:\n\n'''json\n{\n \"instruction\": \"What is IEEE?\",\n \"response\": \"The IEEE or Institute of Electrical and Electronics Engineers is the world's largest professional technical organization dedicated to the advancement of technology for the benefit of humanity.\"\n}",
"### Curation Rationale\n\nThe motivation behind curating this dataset stems from a genuine desire to empower and support university students pursuing B.Tech degrees. Recognizing the pivotal role that IEEE Student Branch at Vishwakarma Institute of Technology (VIT) Pune plays in students' academic journeys, the aim was to create a resource that elucidates the myriad ways in which IEEE SB VIT Pune can enrich and enhance students' educational experiences.\nAt its core, this dataset is a testament to the commitment of the AI team at IEEE SB VIT Pune to empower B.Tech students with valuable insights and resources. By curating a comprehensive collection of topics spanning FAQs, placement strategies, technical concepts, and research discussions, the dataset seeks to equip students with the knowledge and understanding necessary to navigate their academic pursuits effectively.",
"## Dataset Card Authors\nAI Team- IEEE SB VIT Pune\nMrunmayee Phadke (Project Head)\nHritesh Maikap\nNidhish\nArya Lokhande\nApurva Kota\nSoham Nimale"
] | [
6,
76,
4,
123,
69,
54,
88,
195,
47
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name\n\nThis dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.## Dataset Details### Dataset Description\nThe dataset comprises a comprehensive selection of topics, including but not limited to:\n\nFrequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.\nInquiries pertaining to placements, encompassing strategies, tips, and common queries.\nQuestions related to fundamental concepts in Data Structures and Algorithms.\nQueries and discussions regarding research papers, methodologies, and academic pursuits.\n\n- Curated by: AI Team- IEEE SB VIT Pune## Uses\n\nThis data was particularly designed for a chatbot for IEEE SB VIT Pune so that university students could use it for their own benifits, but it includes some general topics related to Research Papers, Data Structure and Algorithms and Placements that can be used by others for their custom chatbot## Dataset Structure\n\nThe dataset consists of the following fields:\n\n- Instruction: This field represents the prompt or query posed to the chatbot.\n- Response: This field contains the corresponding generated response by the chatbot.## Dataset Structure Information\n\nThe dataset is structured in a JSON format, with each entry containing the following fields:\n\n'''json\n{\n \"instruction\": \"What is IEEE?\",\n \"response\": \"The IEEE or Institute of Electrical and Electronics Engineers is the world's largest professional technical organization dedicated to the advancement of technology for the benefit of humanity.\"\n}"
] |
36aacdbb9a6d2d483b04f5637c57ab17828d7f82 |
# Dataset Card for Evaluation run of KnutJaegersberg/Deita-2b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deita-2b](https://huggingface.co/KnutJaegersberg/Deita-2b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deita-2b",
"harness_winogrande_5",
split="train")
```
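The snippet above pulls the per-sample details for one task; here is a minimal sketch along the same lines (the "results" configuration and the timestamped split name are taken from this card's own config list, and only standard `load_dataset` arguments are used) for loading the aggregated scores or a specific run:

```python
from datasets import load_dataset

# Aggregated scores: one row per run; "latest" points at the newest run.
results = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deita-2b",
	"results",
	split="latest")

# A specific run can be addressed by its timestamped split name instead.
run_details = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deita-2b",
	"harness_winogrande_5",
	split="2024_02_13T12_11_14.292713")
```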
## Latest results
These are the [latest results from run 2024-02-13T12:11:14.292713](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deita-2b/blob/main/results_2024-02-13T12-11-14.292713.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5258334013736883,
"acc_stderr": 0.034372132471159916,
"acc_norm": 0.529516305318977,
"acc_norm_stderr": 0.03507673065253058,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557978,
"mc2": 0.39613062371010777,
"mc2_stderr": 0.014153658398749146
},
"harness|arc:challenge|25": {
"acc": 0.40187713310580203,
"acc_stderr": 0.01432726861457828,
"acc_norm": 0.447098976109215,
"acc_norm_stderr": 0.014529380160526843
},
"harness|hellaswag|10": {
"acc": 0.517625970922127,
"acc_stderr": 0.004986680048438311,
"acc_norm": 0.7039434375622386,
"acc_norm_stderr": 0.0045558324627745905
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.030656748696739428,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.030656748696739428
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273956,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273956
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.0250437573185202,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.0250437573185202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.02757596072327824,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.02757596072327824
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.037563357751878974,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.037563357751878974
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6515151515151515,
"acc_stderr": 0.03394853965156402,
"acc_norm": 0.6515151515151515,
"acc_norm_stderr": 0.03394853965156402
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969115,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6935779816513762,
"acc_stderr": 0.019765517220458523,
"acc_norm": 0.6935779816513762,
"acc_norm_stderr": 0.019765517220458523
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.0332057461294543,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.0332057461294543
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610798,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610798
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922758,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922758
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6602809706257982,
"acc_stderr": 0.01693639411430165,
"acc_norm": 0.6602809706257982,
"acc_norm_stderr": 0.01693639411430165
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098435,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098435
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.028074158947600653,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.028074158947600653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5895061728395061,
"acc_stderr": 0.027371350925124764,
"acc_norm": 0.5895061728395061,
"acc_norm_stderr": 0.027371350925124764
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39765319426336376,
"acc_stderr": 0.012499840347460643,
"acc_norm": 0.39765319426336376,
"acc_norm_stderr": 0.012499840347460643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004144,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004144
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.020220920829626923,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.020220920829626923
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972745,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972745
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557978,
"mc2": 0.39613062371010777,
"mc2_stderr": 0.014153658398749146
},
"harness|winogrande|5": {
"acc": 0.6527229676400947,
"acc_stderr": 0.013380909249751233
},
"harness|gsm8k|5": {
"acc": 0.4131918119787718,
"acc_stderr": 0.013563326951984367
}
}
```
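A minimal standalone sketch (the accuracy values are copied from the JSON above, and the full dictionary is elided for brevity) showing how the `harness|hendrycksTest-*` keys can be parsed to rank the MMLU subtasks by accuracy:

```python
# Subset of the results dict printed above; the remaining tasks are elided.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
    "harness|hendrycksTest-marketing|5": {"acc": 0.7905982905982906},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.2446927374301676},
}

# Strip the "harness|hendrycksTest-" prefix and the "|5" shot suffix.
mmlu = {
    name.split("-", 1)[1].rsplit("|", 1)[0]: scores["acc"]
    for name, scores in results.items()
    if name.startswith("harness|hendrycksTest-")
}

# Weakest subtasks first.
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1]):
    print(f"{task:<20} {acc:.3f}")
```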
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Deita-2b | [
"region:us"
] | 2024-02-13T12:12:56+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Deita-2b", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deita-2b](https://huggingface.co/KnutJaegersberg/Deita-2b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Deita-2b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T12:11:14.292713](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deita-2b/blob/main/results_2024-02-13T12-11-14.292713.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5258334013736883,\n \"acc_stderr\": 0.034372132471159916,\n \"acc_norm\": 0.529516305318977,\n \"acc_norm_stderr\": 0.03507673065253058,\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557978,\n \"mc2\": 0.39613062371010777,\n \"mc2_stderr\": 0.014153658398749146\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.40187713310580203,\n \"acc_stderr\": 0.01432726861457828,\n \"acc_norm\": 0.447098976109215,\n \"acc_norm_stderr\": 0.014529380160526843\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.517625970922127,\n \"acc_stderr\": 0.004986680048438311,\n \"acc_norm\": 0.7039434375622386,\n \"acc_norm_stderr\": 0.0045558324627745905\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.0403356566784832,\n \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.0403356566784832\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.030656748696739428,\n \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.030656748696739428\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n 
\"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.03809342081273956,\n \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.03809342081273956\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n \"acc_stderr\": 0.02757596072327824,\n \"acc_norm\": 0.6225806451612903,\n \"acc_norm_stderr\": 0.02757596072327824\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.037563357751878974,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.037563357751878974\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6515151515151515,\n \"acc_stderr\": 0.03394853965156402,\n \"acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.03394853965156402\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.48717948717948717,\n 
\"acc_stderr\": 0.02534267129380725,\n \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.02534267129380725\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969115,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6935779816513762,\n \"acc_stderr\": 0.019765517220458523,\n \"acc_norm\": 0.6935779816513762,\n \"acc_norm_stderr\": 0.019765517220458523\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.03275773486100999,\n \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.03275773486100999\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.0332057461294543,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.0332057461294543\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610798,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610798\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352168,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n \"acc_stderr\": 0.026655699653922758,\n \"acc_norm\": 0.7905982905982906,\n \"acc_norm_stderr\": 0.026655699653922758\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6602809706257982,\n \"acc_stderr\": 0.01693639411430165,\n \"acc_norm\": 0.6602809706257982,\n \"acc_norm_stderr\": 
0.01693639411430165\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n \"acc_stderr\": 0.014378169884098435,\n \"acc_norm\": 0.2446927374301676,\n \"acc_norm_stderr\": 0.014378169884098435\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.028074158947600653,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.028074158947600653\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.6109324758842444,\n \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5895061728395061,\n \"acc_stderr\": 0.027371350925124764,\n \"acc_norm\": 0.5895061728395061,\n \"acc_norm_stderr\": 0.027371350925124764\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39765319426336376,\n \"acc_stderr\": 0.012499840347460643,\n \"acc_norm\": 0.39765319426336376,\n \"acc_norm_stderr\": 0.012499840347460643\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004144,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004144\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.020220920829626923,\n \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.020220920829626923\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.0340105262010409,\n \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.0340105262010409\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557978,\n \"mc2\": 0.39613062371010777,\n \"mc2_stderr\": 0.014153658398749146\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6527229676400947,\n \"acc_stderr\": 0.013380909249751233\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4131918119787718,\n \"acc_stderr\": 0.013563326951984367\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/Deita-2b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|arc:challenge|25_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|gsm8k|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hellaswag|10_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-11-14.292713.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-11-14.292713.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-11-14.292713.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T12-11-14.292713.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-11-14.292713.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-11-14.292713.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["**/details_harness|winogrande|5_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T12-11-14.292713.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T12_11_14.292713", "path": ["results_2024-02-13T12-11-14.292713.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T12-11-14.292713.parquet"]}]}]} | 2024-02-13T12:13:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/Deita-2b
Dataset automatically created during the evaluation run of model KnutJaegersberg/Deita-2b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
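A minimal sketch of that snippet (assuming the dataset id follows the `details_<org>__<model>` naming used by the sibling cards, and using the `harness_winogrande_5` configuration listed in the metadata above):

```python
from datasets import load_dataset

# Hypothetical dataset id, inferred from the details_<org>__<model> naming pattern;
# the "latest" split points to the most recent run per the config metadata.
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deita-2b",
                    "harness_winogrande_5",
                    split="latest")
```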
## Latest results
These are the latest results from run 2024-02-13T12:11:14.292713 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KnutJaegersberg/Deita-2b\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Deita-2b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T12:11:14.292713(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/Deita-2b\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Deita-2b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T12:11:14.292713(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/Deita-2b\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Deita-2b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T12:11:14.292713(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
630ffe02ae16b657616f7638d0a29efe3c5ed0cd |
# Dataset Card for Evaluation run of KnutJaegersberg/Deita-4b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deita-4b](https://huggingface.co/KnutJaegersberg/Deita-4b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deita-4b",
"harness_winogrande_5",
split="train")
```
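The aggregated metrics mentioned above live in the "results" configuration. A short loading sketch (assuming the "results" config name and "latest" split naming shown in the metadata of the sibling cards):

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points to
# the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deita-4b",
                       "results",
                       split="latest")
print(results)  # inspect the schema of the aggregated metrics
```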
## Latest results
These are the [latest results from run 2024-02-13T12:13:56.610648](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deita-4b/blob/main/results_2024-02-13T12-13-56.610648.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5527501076415084,
"acc_stderr": 0.03417435776098714,
"acc_norm": 0.5557602717989104,
"acc_norm_stderr": 0.03486732917817813,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.5022562674091439,
"mc2_stderr": 0.014750233587042572
},
"harness|arc:challenge|25": {
"acc": 0.4300341296928328,
"acc_stderr": 0.01446763155913799,
"acc_norm": 0.46075085324232085,
"acc_norm_stderr": 0.014566303676636586
},
"harness|hellaswag|10": {
"acc": 0.5262895837482573,
"acc_stderr": 0.004982879340691411,
"acc_norm": 0.7180840470025891,
"acc_norm_stderr": 0.00449013069102043
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.04060127035236395,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.04060127035236395
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.02563425811555496,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.02563425811555496
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.032018671228777947,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.032018671228777947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.02531764972644866,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.02531764972644866
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5504201680672269,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.5504201680672269,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7192660550458716,
"acc_stderr": 0.019266055045871616,
"acc_norm": 0.7192660550458716,
"acc_norm_stderr": 0.019266055045871616
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.031980016601150706,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.031980016601150706
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906276,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906276
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543688,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7318007662835249,
"acc_stderr": 0.015842430835269438,
"acc_norm": 0.7318007662835249,
"acc_norm_stderr": 0.015842430835269438
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369918,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.02753007844711031,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.02753007844711031
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484624,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.027272582849839803,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.027272582849839803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.0294621892333706,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.0294621892333706
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40091264667535853,
"acc_stderr": 0.012516960350640828,
"acc_norm": 0.40091264667535853,
"acc_norm_stderr": 0.012516960350640828
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5506535947712419,
"acc_stderr": 0.020123766528027266,
"acc_norm": 0.5506535947712419,
"acc_norm_stderr": 0.020123766528027266
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505418,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505418
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.030555316755573637,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.030555316755573637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.5022562674091439,
"mc2_stderr": 0.014750233587042572
},
"harness|winogrande|5": {
"acc": 0.6614048934490924,
"acc_stderr": 0.01330016986584242
},
"harness|gsm8k|5": {
"acc": 0.4890068233510235,
"acc_stderr": 0.013769155509690904
}
}
```
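Each reported standard error implies an approximate 95% confidence interval of acc ± 1.96 · acc_stderr. A quick worked example on the winogrande figures above (a sketch assuming a normal approximation, not how the leaderboard itself reports uncertainty):

```python
# Approximate 95% confidence interval from the reported mean and standard error,
# assuming a normal approximation: acc +/- 1.96 * acc_stderr.
acc, stderr = 0.6614048934490924, 0.01330016986584242  # harness|winogrande|5 above
low, high = acc - 1.96 * stderr, acc + 1.96 * stderr
print(f"winogrande acc: {acc:.4f} (95% CI: {low:.4f}-{high:.4f})")
# -> winogrande acc: 0.6614 (95% CI: 0.6353-0.6875)
```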
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Deita-4b | [
"region:us"
] | 2024-02-13T12:16:01+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Deita-4b", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deita-4b](https://huggingface.co/KnutJaegersberg/Deita-4b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Deita-4b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T12:13:56.610648](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deita-4b/blob/main/results_2024-02-13T12-13-56.610648.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5527501076415084,\n \"acc_stderr\": 0.03417435776098714,\n \"acc_norm\": 0.5557602717989104,\n \"acc_norm_stderr\": 0.03486732917817813,\n \"mc1\": 0.32558139534883723,\n \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.5022562674091439,\n \"mc2_stderr\": 0.014750233587042572\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4300341296928328,\n \"acc_stderr\": 0.01446763155913799,\n \"acc_norm\": 0.46075085324232085,\n \"acc_norm_stderr\": 0.014566303676636586\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5262895837482573,\n \"acc_stderr\": 0.004982879340691411,\n \"acc_norm\": 0.7180840470025891,\n \"acc_norm_stderr\": 0.00449013069102043\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.04060127035236395,\n \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.04060127035236395\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n 
\"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.02563425811555496,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.02563425811555496\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.032018671228777947,\n \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.032018671228777947\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5256410256410257,\n \"acc_stderr\": 0.02531764972644866,\n \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.02531764972644866\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7192660550458716,\n \"acc_stderr\": 0.019266055045871616,\n \"acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.019266055045871616\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696043,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.031980016601150706,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.031980016601150706\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04643454608906276,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04643454608906276\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543688,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543688\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7318007662835249,\n \"acc_stderr\": 0.015842430835269438,\n \"acc_norm\": 
0.7318007662835249,\n \"acc_norm_stderr\": 0.015842430835269438\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n \"acc_stderr\": 0.014551553659369918,\n \"acc_norm\": 0.2536312849162011,\n \"acc_norm_stderr\": 0.014551553659369918\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.02753007844711031,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.02753007844711031\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n \"acc_stderr\": 0.027917050748484624,\n \"acc_norm\": 0.5916398713826366,\n \"acc_norm_stderr\": 0.027917050748484624\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.027272582849839803,\n \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.027272582849839803\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.0294621892333706,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.0294621892333706\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40091264667535853,\n \"acc_stderr\": 0.012516960350640828,\n \"acc_norm\": 0.40091264667535853,\n \"acc_norm_stderr\": 0.012516960350640828\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213528,\n \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213528\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5506535947712419,\n \"acc_stderr\": 0.020123766528027266,\n \"acc_norm\": 0.5506535947712419,\n \"acc_norm_stderr\": 0.020123766528027266\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.04582004841505418,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.04582004841505418\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573637,\n \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.5022562674091439,\n \"mc2_stderr\": 0.014750233587042572\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6614048934490924,\n \"acc_stderr\": 0.01330016986584242\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4890068233510235,\n \"acc_stderr\": 0.013769155509690904\n }\n}\n```", "repo_url": 
"https://huggingface.co/KnutJaegersberg/Deita-4b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|arc:challenge|25_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|gsm8k|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hellaswag|10_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-13-56.610648.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-13-56.610648.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-13-56.610648.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T12-13-56.610648.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-13-56.610648.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-13-56.610648.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["**/details_harness|winogrande|5_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T12-13-56.610648.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T12_13_56.610648", "path": ["results_2024-02-13T12-13-56.610648.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T12-13-56.610648.parquet"]}]}]} | 2024-02-13T12:16:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/Deita-4b
Dataset automatically created during the evaluation run of model KnutJaegersberg/Deita-4b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
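A minimal sketch, assuming the details repository follows the leaderboard's usual naming convention (open-llm-leaderboard/details_<org>__<model>):

```python
from datasets import load_dataset

# Per-sample details for one task configuration; the "train" split
# always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deita-4b",
	"harness_winogrande_5",
	split="train")
```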
## Latest results
These are the latest results from run 2024-02-13T12:13:56.610648 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
f8c1c3a61ad8d4b85ec73037643ce37fd24fe2d5 | # Dataset Card for "input-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | rajivkale/dataset-webhook-testing | [
"region:us"
] | 2024-02-13T12:20:46+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "ADONIS", "1": "AFRICAN GIANT SWALLOWTAIL", "2": "AMERICAN SNOOT"}}}}], "splits": [{"name": "train", "num_bytes": 8825732.0, "num_examples": 338}], "download_size": 8823395, "dataset_size": 8825732.0}} | 2024-02-13T12:28:58+00:00 | [] | [] | TAGS
#region-us
309bf7efc0c5f94fb929cf7da1253211f372d29a |
# Dataset Card for Evaluation run of AbacusResearch/jaLLAbi
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AbacusResearch/jaLLAbi](https://huggingface.co/AbacusResearch/jaLLAbi) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AbacusResearch__jaLLAbi",
"harness_winogrande_5",
split="train")
```
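To work with the aggregated scores rather than the per-task details, you can also load the "results" configuration; a minimal sketch, assuming the "latest" split listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" always points to the
# most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_AbacusResearch__jaLLAbi",
	"results",
	split="latest")
```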
## Latest results
These are the [latest results from run 2024-02-13T12:32:57.211662](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__jaLLAbi/blob/main/results_2024-02-13T12-32-57.211662.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AbacusResearch__jaLLAbi | [
"region:us"
] | 2024-02-13T12:35:15+00:00 | {"pretty_name": "Evaluation run of AbacusResearch/jaLLAbi", "dataset_summary": "Dataset automatically created during the evaluation run of model [AbacusResearch/jaLLAbi](https://huggingface.co/AbacusResearch/jaLLAbi) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AbacusResearch__jaLLAbi\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T12:32:57.211662](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__jaLLAbi/blob/main/results_2024-02-13T12-32-57.211662.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 
0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/AbacusResearch/jaLLAbi", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": 
"harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|arc:challenge|25_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|gsm8k|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hellaswag|10_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-32-57.211662.parquet", 
"**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-32-57.211662.parquet", 
"**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-32-57.211662.parquet", 
"**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T12-32-57.211662.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": 
["**/details_harness|hendrycksTest-management|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["**/details_harness|winogrande|5_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T12-32-57.211662.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T12_32_57.211662", "path": ["results_2024-02-13T12-32-57.211662.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T12-32-57.211662.parquet"]}]}]} | 2024-02-13T12:35:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AbacusResearch/jaLLAbi
Dataset automatically created during the evaluation run of model AbacusResearch/jaLLAbi on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
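The example below is a minimal sketch following the naming convention used by these evaluation datasets; the dataset id `open-llm-leaderboard/details_AbacusResearch__jaLLAbi` is assumed from that convention rather than stated explicitly in this card:

```python
from datasets import load_dataset

# Dataset id assumed from the leaderboard's details_<org>__<model> convention;
# "harness_winogrande_5" is one of the configurations listed in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_AbacusResearch__jaLLAbi",
	"harness_winogrande_5",
	split="train")
```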
## Latest results
These are the latest results from run 2024-02-13T12:32:57.211662 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AbacusResearch/jaLLAbi\n\n\n\nDataset automatically created during the evaluation run of model AbacusResearch/jaLLAbi on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T12:32:57.211662(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AbacusResearch/jaLLAbi\n\n\n\nDataset automatically created during the evaluation run of model AbacusResearch/jaLLAbi on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T12:32:57.211662(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
179,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AbacusResearch/jaLLAbi\n\n\n\nDataset automatically created during the evaluation run of model AbacusResearch/jaLLAbi on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T12:32:57.211662(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
da994d9199b02a0bcf316a27e9766b507860565d |
# Dataset Card for Evaluation run of Technoculture/mtor-2x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/mtor-2x7b](https://huggingface.co/Technoculture/mtor-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__mtor-2x7b",
"harness_winogrande_5",
split="train")
```
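The aggregated metrics described above can be loaded the same way; this is a minimal sketch assuming the "results" configuration name and the "latest" split naming used throughout this card:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# always points to the most recent evaluation timestamp.
results = load_dataset("open-llm-leaderboard/details_Technoculture__mtor-2x7b",
	"results",
	split="latest")
```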
## Latest results
These are the [latest results from run 2024-02-13T12:35:53.883707](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__mtor-2x7b/blob/main/results_2024-02-13T12-35-53.883707.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.513846349514935,
"acc_stderr": 0.034127487865330444,
"acc_norm": 0.5225503308269146,
"acc_norm_stderr": 0.03496907428627984,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.48059153955864553,
"mc2_stderr": 0.014969300928874024
},
"harness|arc:challenge|25": {
"acc": 0.5162116040955631,
"acc_stderr": 0.014603708567414947,
"acc_norm": 0.5520477815699659,
"acc_norm_stderr": 0.014532011498211678
},
"harness|hellaswag|10": {
"acc": 0.543218482374029,
"acc_stderr": 0.00497110626504655,
"acc_norm": 0.7360087631945827,
"acc_norm_stderr": 0.004398937225038412
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.041711158581816184,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.041711158581816184
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709390974,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709390974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374766,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374766
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5967741935483871,
"acc_stderr": 0.027906150826041146,
"acc_norm": 0.5967741935483871,
"acc_norm_stderr": 0.027906150826041146
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969565,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969565
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6515151515151515,
"acc_stderr": 0.03394853965156402,
"acc_norm": 0.6515151515151515,
"acc_norm_stderr": 0.03394853965156402
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7461139896373057,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.7461139896373057,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7045871559633028,
"acc_stderr": 0.019560619182976,
"acc_norm": 0.7045871559633028,
"acc_norm_stderr": 0.019560619182976
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5521472392638037,
"acc_stderr": 0.03906947479456607,
"acc_norm": 0.5521472392638037,
"acc_norm_stderr": 0.03906947479456607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.027601921381417604,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.027601921381417604
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6934865900383141,
"acc_stderr": 0.01648695289304151,
"acc_norm": 0.6934865900383141,
"acc_norm_stderr": 0.01648695289304151
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098177,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527817,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527817
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5627009646302251,
"acc_stderr": 0.0281739177617629,
"acc_norm": 0.5627009646302251,
"acc_norm_stderr": 0.0281739177617629
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5802469135802469,
"acc_stderr": 0.027460099557005138,
"acc_norm": 0.5802469135802469,
"acc_norm_stderr": 0.027460099557005138
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3474576271186441,
"acc_stderr": 0.0121614177297498,
"acc_norm": 0.3474576271186441,
"acc_norm_stderr": 0.0121614177297498
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02021703065318646,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02021703065318646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789848,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789848
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.48059153955864553,
"mc2_stderr": 0.014969300928874024
},
"harness|winogrande|5": {
"acc": 0.7063930544593529,
"acc_stderr": 0.012799397296204164
},
"harness|gsm8k|5": {
"acc": 0.03639120545868082,
"acc_stderr": 0.005158113489231194
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Technoculture__mtor-2x7b | [
"region:us"
] | 2024-02-13T12:37:42+00:00 | {"pretty_name": "Evaluation run of Technoculture/mtor-2x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Technoculture/mtor-2x7b](https://huggingface.co/Technoculture/mtor-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__mtor-2x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T12:35:53.883707](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__mtor-2x7b/blob/main/results_2024-02-13T12-35-53.883707.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.513846349514935,\n \"acc_stderr\": 0.034127487865330444,\n \"acc_norm\": 0.5225503308269146,\n \"acc_norm_stderr\": 0.03496907428627984,\n \"mc1\": 0.3108935128518972,\n \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.48059153955864553,\n \"mc2_stderr\": 0.014969300928874024\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5162116040955631,\n \"acc_stderr\": 0.014603708567414947,\n \"acc_norm\": 0.5520477815699659,\n \"acc_norm_stderr\": 0.014532011498211678\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.543218482374029,\n \"acc_stderr\": 0.00497110626504655,\n \"acc_norm\": 0.7360087631945827,\n \"acc_norm_stderr\": 0.004398937225038412\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n \"acc_stderr\": 0.041711158581816184,\n \"acc_norm\": 0.5347222222222222,\n \"acc_norm_stderr\": 0.041711158581816184\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n 
\"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709390974,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709390974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374766,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374766\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5967741935483871,\n \"acc_stderr\": 0.027906150826041146,\n \"acc_norm\": 0.5967741935483871,\n \"acc_norm_stderr\": 0.027906150826041146\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969565,\n \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969565\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6515151515151515,\n \"acc_stderr\": 0.03394853965156402,\n \"acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.03394853965156402\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7461139896373057,\n \"acc_stderr\": 0.03141024780565319,\n \"acc_norm\": 0.7461139896373057,\n \"acc_norm_stderr\": 0.03141024780565319\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.44871794871794873,\n \"acc_stderr\": 0.025217315184846482,\n \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846482\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7045871559633028,\n \"acc_stderr\": 0.019560619182976,\n \"acc_norm\": 0.7045871559633028,\n \"acc_norm_stderr\": 0.019560619182976\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5521472392638037,\n \"acc_stderr\": 0.03906947479456607,\n \"acc_norm\": 0.5521472392638037,\n \"acc_norm_stderr\": 0.03906947479456607\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n \"acc_stderr\": 0.027601921381417604,\n \"acc_norm\": 0.7692307692307693,\n \"acc_norm_stderr\": 0.027601921381417604\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6934865900383141,\n \"acc_stderr\": 0.01648695289304151,\n \"acc_norm\": 
0.6934865900383141,\n \"acc_norm_stderr\": 0.01648695289304151\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098177,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098177\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n \"acc_stderr\": 0.014635185616527817,\n \"acc_norm\": 0.2581005586592179,\n \"acc_norm_stderr\": 0.014635185616527817\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283693,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283693\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5627009646302251,\n \"acc_stderr\": 0.0281739177617629,\n \"acc_norm\": 0.5627009646302251,\n \"acc_norm_stderr\": 0.0281739177617629\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5802469135802469,\n \"acc_stderr\": 0.027460099557005138,\n \"acc_norm\": 0.5802469135802469,\n \"acc_norm_stderr\": 0.027460099557005138\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3474576271186441,\n \"acc_stderr\": 0.0121614177297498,\n \"acc_norm\": 0.3474576271186441,\n \"acc_norm_stderr\": 0.0121614177297498\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02021703065318646,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02021703065318646\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789848,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789848\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n \"acc_stderr\": 0.03390393042268814,\n \"acc_norm\": 0.6417910447761194,\n \"acc_norm_stderr\": 0.03390393042268814\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.48059153955864553,\n \"mc2_stderr\": 0.014969300928874024\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7063930544593529,\n \"acc_stderr\": 0.012799397296204164\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03639120545868082,\n \"acc_stderr\": 0.005158113489231194\n }\n}\n```", "repo_url": "https://huggingface.co/Technoculture/mtor-2x7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|arc:challenge|25_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|gsm8k|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hellaswag|10_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-35-53.883707.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-35-53.883707.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-35-53.883707.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T12-35-53.883707.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-35-53.883707.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-35-53.883707.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["**/details_harness|winogrande|5_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T12-35-53.883707.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T12_35_53.883707", "path": ["results_2024-02-13T12-35-53.883707.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T12-35-53.883707.parquet"]}]}]} | 2024-02-13T12:38:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Technoculture/mtor-2x7b
Dataset automatically created during the evaluation run of model Technoculture/mtor-2x7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
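Following the loading pattern given in this card's own metadata (any of the 63 config names can replace `harness_winogrande_5`):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Technoculture__mtor-2x7b",
	"harness_winogrande_5",
	split="train")
```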
## Latest results
These are the latest results from run 2024-02-13T12:35:53.883707 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
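The aggregated numbers themselves live in the "results" configuration, which can be loaded directly (the "latest" split name is taken from the config list in this card's metadata):

```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_Technoculture__mtor-2x7b",
	"results",
	split="latest")
```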
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Technoculture/mtor-2x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/mtor-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T12:35:53.883707(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Technoculture/mtor-2x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/mtor-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T12:35:53.883707(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
179,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Technoculture/mtor-2x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/mtor-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T12:35:53.883707(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
13b7c6801089f8c48cf78492fb51d7cfd344ff9a | # Liveness Detection - Video Classification
A biometric attack dataset containing **replay attacks** on real videos of people. A **replay attack** involves presenting a pre-recorded video or previously captured footage as if it were occurring in real time.
The primary objective is to distinguish between genuine, real-time footage and manipulated recordings.
The videos were gathered by capturing the faces of genuine individuals and then presenting those facial recordings as spoofs. The dataset supports approaches that learn to detect spoofing techniques by extracting features from genuine facial images, preventing fake users from capturing and reusing such information.
The dataset contains videos of real humans with various **resolutions, views, and colors**, making it a comprehensive resource for researchers working on anti-spoofing technologies.

The dataset provides data for combining and applying different techniques, approaches, and models to the challenging task of distinguishing genuine inputs from spoofed ones, enabling effective anti-spoofing solutions in active authentication systems. Such solutions are crucial because newer devices, such as phones, have become vulnerable to spoofing attacks now that technologies for producing convincing replays, reflections, and depth cues are widely available.
### People in the dataset

Our dataset also lends itself to neural architectures, such as deep neural networks, which can identify distinguishing patterns and textures in different regions of the face, increasing the accuracy and generalizability of anti-spoofing models.
# 💴 For Commercial Usage: The full version of the dataset includes 51,000+ videos; leave a request on **[TrainingData](https://trainingdata.pro/data-market/anti-spoofing-replay-attack?utm_source=huggingface&utm_medium=cpc&utm_campaign=display-spoof)** to buy the dataset
### Metadata for the full dataset:
- **replay.assignment_id** - unique identifier of the media file
- **real_assignment_id** - unique identifier of the media file from the [Antispoofing Real Dataset](https://trainingdata.pro/data-market/antispoofing-real?utm_source=kaggle&utm_medium=cpc&utm_campaign=antispoofing-replay-dataset)
- **worker_id** - unique identifier of the person
- **age** - age of the person
- **true_gender** - gender of the person
- **country** - country of the person
- **ethnicity** - ethnicity of the person
- **video_extension** - video extensions in the dataset
- **video_resolution** - video resolution in the dataset
- **video_duration** - video duration in the dataset
- **video_fps** - frames per second for video in the dataset
# 💴 Buy the Dataset: This is just an example of the data. Leave a request on **[https://trainingdata.pro/data-market](https://trainingdata.pro/data-market/anti-spoofing-replay-attack?utm_source=huggingface&utm_medium=cpc&utm_campaign=display-spoof) to learn about the price and buy the dataset**
# Content
The dataset includes a **files** folder with videos of people.
### File with the extension .csv
- **id**: id of the person,
- **file**: link to access the display spoof attack video
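A minimal sketch of reading that file with pandas (the file name `display_spoof_attack.csv` is an assumption; substitute the actual CSV shipped with the dataset):

```python
import pandas as pd

# Each row links a person id to one replay-attack video.
df = pd.read_csv("display_spoof_attack.csv")  # hypothetical file name

# Example: number of spoof videos per person
print(df.groupby("id")["file"].count())
```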
## **[TrainingData](https://trainingdata.pro/data-market/anti-spoofing-replay-attack?utm_source=huggingface&utm_medium=cpc&utm_campaign=display-spoof)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **<https://www.kaggle.com/trainingdatapro/datasets>**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets**
*keywords: liveness detection systems, liveness detection dataset, biometric dataset, biometric data dataset, biometric system attacks, anti-spoofing dataset, face liveness detection, deep learning dataset, face spoofing database, face anti-spoofing, ibeta dataset, face anti spoofing, large-scale face anti spoofing, rich annotations anti spoofing dataset* | TrainingDataPro/display-spoof-attack | [
"task_categories:video-classification",
"task_categories:image-to-video",
"language:en",
"license:cc-by-nc-nd-4.0",
"code",
"finance",
"legal",
"region:us"
] | 2024-02-13T12:45:56+00:00 | {"language": ["en"], "license": "cc-by-nc-nd-4.0", "task_categories": ["video-classification", "image-to-video"], "tags": ["code", "finance", "legal"]} | 2024-02-13T12:53:32+00:00 | [] | [
"en"
] | TAGS
#task_categories-video-classification #task_categories-image-to-video #language-English #license-cc-by-nc-nd-4.0 #code #finance #legal #region-us
| # Liveness Detection - Video Classification
A biometric attack dataset containing replay attacks on real videos of people. A replay attack involves presenting a pre-recorded video or previously captured footage as if it were occurring in real time.
The primary objective is to distinguish between genuine, real-time footage and manipulated recordings.
The videos were gathered by capturing the faces of genuine individuals and then presenting those facial recordings as spoofs. The dataset supports approaches that learn to detect spoofing techniques by extracting features from genuine facial images, preventing fake users from capturing and reusing such information.
The dataset contains videos of real humans with various resolutions, views, and colors, making it a comprehensive resource for researchers working on anti-spoofing technologies.
 magyar utasításokra szűrve. | boapps/aya_dataset_hu | [
"language:hu",
"region:us"
] | 2024-02-13T12:47:44+00:00 | {"language": ["hu"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "language_code", "dtype": "string"}, {"name": "annotation_type", "dtype": "string"}, {"name": "user_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 123343.88652131803, "num_examples": 98}, {"name": "test", "num_bytes": 0, "num_examples": 0}], "download_size": 40444, "dataset_size": 123343.88652131803}} | 2024-02-13T12:54:03+00:00 | [] | [
"hu"
] | TAGS
#language-Hungarian #region-us
| # aya_dataset_hu
The aya dataset filtered for Hungarian instructions. | [
"# aya_dataset_hu\n\nAz aya adathalmaz magyar utasításokra szűrve."
] | [
"TAGS\n#language-Hungarian #region-us \n",
"# aya_dataset_hu\n\nAz aya adathalmaz magyar utasításokra szűrve."
] | [
12,
19
] | [
"passage: TAGS\n#language-Hungarian #region-us \n# aya_dataset_hu\n\nAz aya adathalmaz magyar utasításokra szűrve."
] |
9767bde1e557d1aef9bb70808ce5642493c11574 | # CryptoData Dataset
The CryptoData dataset is a comprehensive collection of cryptocurrency market data, designed to support various analyses, including price prediction, market trend analysis, and the study of the impact of various indicators on cryptocurrency prices.
This dataset has been configured to provide flexibility in selecting specific types of market data through the use of different dataset configurations. Depending on the analysis needs, users can select one of the available configurations to load data tailored to their requirements.
## Available Configurations:
1. **Default**: Includes open, high, low, close, and volume for each cryptocurrency market and date.
2. **Close**: Focuses on the close price and volume of each cryptocurrency market and date, optimized for simplicity and analyses centered on closing prices.
3. **Indicators**: Expands upon the default configuration by including technical indicators such as RSI (Relative Strength Index), SMA (Simple Moving Average), and EMA (Exponential Moving Average), aimed at more advanced technical analyses.
4. **Sequences**: Specifically designed for sequence prediction tasks, this configuration provides sequences of market data alongside the corresponding prediction targets, facilitating the development of models for future price prediction.
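For reference, a hedged sketch of how the indicators named in configuration 3 are conventionally computed from close prices with pandas (the 14-period window is a textbook default, not a property of this dataset, and this RSI uses simple rather than Wilder-smoothed averages):

```python
import pandas as pd

def add_indicators(df: pd.DataFrame, period: int = 14) -> pd.DataFrame:
    """Append sma, ema, and rsi columns derived from the close price."""
    df["sma"] = df["close"].rolling(period).mean()
    df["ema"] = df["close"].ewm(span=period, adjust=False).mean()

    delta = df["close"].diff()
    gain = delta.clip(lower=0).rolling(period).mean()
    loss = (-delta.clip(upper=0)).rolling(period).mean()
    df["rsi"] = 100 - 100 / (1 + gain / loss)
    return df
```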
## How to Use:
Below are Python code snippets demonstrating how to load the CryptoData dataset with each configuration. Before running the snippets, ensure you have the `datasets` library from Hugging Face installed.
```python
from datasets import load_dataset
# Load the default configuration (the config is selected with `name`;
# `config_name` is not a valid `load_dataset` keyword, and
# "sebdg/crypto_data" is this dataset's repository id on the Hub)
dataset_default = load_dataset("sebdg/crypto_data", name="default")
# Load the 'close' configuration
dataset_close = load_dataset("sebdg/crypto_data", name="close")
# Load the 'indicators' configuration
dataset_indicators = load_dataset("sebdg/crypto_data", name="indicators")
# Load the 'sequences' configuration
dataset_sequences = load_dataset("sebdg/crypto_data", name="sequences")
```
## Dataset Structure:
- `market`: The cryptocurrency market (e.g., "BTC-USD").
- `date`/`time`: The date or time of the data point.
- `open`, `high`, `low`, `close`: Open, high, low, and close prices for the cryptocurrency.
- `volume`: The volume of transactions.
- `rsi`, `sma`, `ema`: Technical indicators including Relative Strength Index, Simple Moving Average, and Exponential Moving Average (available in the `indicators` configuration).
- `sequence`, `prediction`: Arrays of historical data and the corresponding future data to predict (available in the `sequences` configuration).
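As a quick illustration of these fields, a sketch of inspecting one record from the `indicators` configuration (`split="train"` is an assumption here; check the dataset's actual split names):

```python
from datasets import load_dataset

ds = load_dataset("sebdg/crypto_data", name="indicators", split="train")

row = ds[0]  # one record as a plain dict
print(row["market"], row["close"], row["rsi"], row["sma"], row["ema"])
```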
## Important Notes:
- This dataset is for academic and research purposes only. Ensure compliance with any usage restrictions set by the data provider.
- When using technical indicators in your analysis, be aware that these indicators alone may not provide a complete picture of market dynamics.
- The sequences configuration requires significant preprocessing, including the calculation of technical indicators and the formation of sequences. This configuration is best suited for those with experience in time series analysis and deep learning.
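To make the last note concrete, a minimal sketch of the sequence-formation step (the window and horizon lengths are arbitrary illustrative choices, not properties of this dataset):

```python
import numpy as np

def make_sequences(closes: np.ndarray, window: int = 30, horizon: int = 1):
    """Slice a 1-D close-price array into (sequence, prediction) pairs."""
    xs, ys = [], []
    for i in range(len(closes) - window - horizon + 1):
        xs.append(closes[i : i + window])
        ys.append(closes[i + window : i + window + horizon])
    return np.array(xs), np.array(ys)

closes = np.array([100.0, 101.5, 99.8, 102.3, 103.1, 102.7, 104.0])
X, y = make_sequences(closes, window=3, horizon=1)
print(X.shape, y.shape)  # (4, 3) (4, 1)
```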
## Citation and Acknowledgments:
This dataset is made available for public use by the cryptocurrency research community. While there is no specific citation for this dataset, users are encouraged to reference the dataset's URL and the corresponding author's contributions.
Homepage: [CryptoData Dataset on Hugging Face](https://huggingface.co/datasets/sebdg/crypto_data)
For any questions or issues with the dataset, please raise an issue on the repository hosting the dataset.
| sebdg/crypto_data | [
"task_categories:time-series-forecasting",
"multilinguality:monolingual",
"language:en",
"license:apache-2.0",
"finance",
"crypto",
"economics",
"trading",
"blockchain",
"quantitative-analysis",
"machine-learning",
"deep-learning",
"time-series",
"sequence-modeling",
"price-prediction",
"market-analysis",
"investment-strategies",
"technical-indicators",
"historical-data-analysis",
"region:us"
] | 2024-02-13T13:08:05+00:00 | {"language": ["en"], "license": "apache-2.0", "multilinguality": ["monolingual"], "task_categories": ["time-series-forecasting"], "pretty_name": "CryptoData Dataset", "tags": ["finance", "crypto", "economics", "trading", "blockchain", "quantitative-analysis", "machine-learning", "deep-learning", "time-series", "sequence-modeling", "price-prediction", "market-analysis", "investment-strategies", "technical-indicators", "historical-data-analysis"]} | 2024-02-16T12:18:45+00:00 | [] | [
"en"
] |
15a162d39101fee769652e781c85836ce5411d89 |
I wrote the following script to scrape the data from a platform that sells second-hand apparel and clothing.
[Github | Customer Reviews for Second Hand Apparels](https://github.com/ChaoticQubit/scrape_me_of_my_data/tree/main/Nuuly-Second-Hand-Apparels-Customer_reviews)
The reviews are available for only about 3,500 products, taken from the links in the links.txt file; this is only about 20% of the products available on the platform. Feel free to clone the script and run the scraper on your own system if you need more data.
The reviews.json file contains the data you can use for learning or research purposes: customer reviews in JSON format for about 3,500 products that were sold as second-hand apparel. You can use this data to understand customer perspectives on second-hand clothing and to analyze the limitations and opportunities of the second-hand market.
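A minimal loading sketch; the exact JSON layout is an assumption, since it depends on the scraping script linked above:
```python
import json

# Load the scraped reviews; the top-level structure (array vs. object)
# depends on the scraper, so inspect it before building on it.
with open("reviews.json", encoding="utf-8") as f:
    reviews = json.load(f)

print(type(reviews), len(reviews))
```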
| ChaoticQubit/Customer_Reviews-Second_Hand_Apparels | [
"task_categories:text-classification",
"task_categories:question-answering",
"task_categories:summarization",
"task_categories:feature-extraction",
"task_categories:conversational",
"task_categories:sentence-similarity",
"size_categories:10K<n<100K",
"language:en",
"Customer Reviews",
"Second-Hand",
"Product Reviews",
"Comments",
"Text Data",
"region:us"
] | 2024-02-13T13:13:33+00:00 | {"language": ["en"], "size_categories": ["10K<n<100K"], "task_categories": ["text-classification", "question-answering", "summarization", "feature-extraction", "conversational", "sentence-similarity"], "tags": ["Customer Reviews", "Second-Hand", "Product Reviews", "Comments", "Text Data"]} | 2024-02-13T13:31:32+00:00 | [] | [
"en"
] |
791c962da3c1677bb7cbc73b38aca60d82610688 |
# Filtered CohereForAI/aya_dataset on zsm language

Originally from https://huggingface.co/datasets/CohereForAI/aya_dataset, filtered to rows with the `zsm` language code only.
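A minimal filtering sketch; it assumes the Aya rows expose a `language_code` column holding ISO 639-3 codes such as `zsm` (Standard Malay):
```python
from datasets import load_dataset

# Filter the Aya dataset down to Standard Malay rows.
# The "language_code" column name is an assumption about the source schema.
aya = load_dataset("CohereForAI/aya_dataset", split="train")
aya_zsm = aya.filter(lambda row: row["language_code"] == "zsm")
print(len(aya_zsm))
```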
| malaysia-ai/filtered-aya-dataset-zsm | [
"task_categories:question-answering",
"language:ms",
"region:us"
] | 2024-02-13T13:29:35+00:00 | {"language": ["ms"], "task_categories": ["question-answering"]} | 2024-02-13T13:31:10+00:00 | [] | [
"ms"
] |
9cc2952e111b55d333fc21573f59ecedde343b6f |
The SCHAEFFER dataset (Spectro-morphological Corpus of Human-annotated Audio with Electroacoustic Features for Experimental Research) is a compilation of 788 raw audio recordings accompanied by human annotations and morphological acoustic features.
The audio files adhere to the concept of Sound Objects introduced by Pierre Schaeffer, a framework for the analysis and creation of sound that focuses on its typological and morphological characteristics.
Inside the dataset, the annotations are provided in the form of free text, while the labels are pre-chosen from a list of classes, making the sound descriptions fit into a framework suitable for digital analysis.
All the sounds within the dataset are released under a CC BY 4.0 (attribution) license.
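A minimal inspection sketch; the field names below are taken from this dataset's declared features, while the exact label vocabularies are whatever the corpus actually contains:
```python
from datasets import load_dataset

ds = load_dataset("dbschaeffer/schaeffer_thesis_corrected", split="train")

# List the pre-chosen class labels used for a few annotation fields.
for field in ["PulseTypology", "Complexity", "Onset", "Offset", "MassType", "Direction"]:
    print(field, sorted(set(ds[field])))

# Each example also carries the raw audio and a free-text description.
example = ds[0]
print(example["description"])
print(example["audio"]["sampling_rate"], len(example["audio"]["array"]))
```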
| dbschaeffer/schaeffer_thesis_corrected | [
"license:cc-by-4.0",
"region:us"
] | 2024-02-13T13:53:03+00:00 | {"license": "cc-by-4.0", "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "username", "dtype": "string"}, {"name": "Processes", "dtype": "string"}, {"name": "PulseTypology", "dtype": "string"}, {"name": "Complexity", "dtype": "string"}, {"name": "Onset", "dtype": "string"}, {"name": "Offset", "dtype": "string"}, {"name": "Type", "dtype": "string"}, {"name": "MassType", "dtype": "string"}, {"name": "Direction", "dtype": "string"}, {"name": "description", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1918141228, "num_examples": 788}], "download_size": 1608587794, "dataset_size": 1918141228}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T00:06:10+00:00 | [] | [] |
9fb09026f511af0abed1be6d81c80d9d6f7c4df4 |
# DPO Binarized filtered-aya_dataset-zsm
DPO binarized-style dataset built from https://huggingface.co/datasets/CohereForAI/aya_dataset filtered to the `zsm` language only; after filtering, https://huggingface.co/mesolitica/malaysian-mistral-7b-32k-instructions-v4 was used to generate outputs, and those generated outputs populate the `rejected` column.
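As a rough illustration of that binarization step (the Aya column names `inputs`/`targets` and the `generate` helper are assumptions, not the exact pipeline used), each row pairs a human-written answer as `chosen` with a model-generated answer as `rejected`:
```python
# Minimal DPO-binarization sketch; column names and generate() are assumptions.
def to_dpo_row(aya_row: dict, generate) -> dict:
    prompt = aya_row["inputs"]
    return {
        "prompt": prompt,
        "chosen": aya_row["targets"],   # human-written Aya answer (assumed)
        "rejected": generate(prompt),   # model-generated answer
    }
```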
Read more about the DPO binarized dataset format at https://huggingface.co/docs/trl/main/en/dpo_trainer. | mesolitica/DPO-filtered-aya_dataset-zsm | [
"language:ms",
"region:us"
] | 2024-02-13T13:54:27+00:00 | {"language": ["ms"], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12310875, "num_examples": 10073}], "download_size": 5813801, "dataset_size": 12310875}} | 2024-02-13T15:30:43+00:00 | [] | [
"ms"
] |
7c998d6a0c7fafea51c7d5f19862b292151fa0bb |
# Dataset Card for Evaluation run of Himitsui/Kaiju-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Himitsui/Kaiju-11B](https://huggingface.co/Himitsui/Kaiju-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Himitsui__Kaiju-11B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T14:10:39.178663](https://huggingface.co/datasets/open-llm-leaderboard/details_Himitsui__Kaiju-11B/blob/main/results_2024-02-13T14-10-39.178663.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6710194744975722,
"acc_stderr": 0.031339984413299014,
"acc_norm": 0.6720247844151348,
"acc_norm_stderr": 0.03197641248956888,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6214829965226529,
"mc2_stderr": 0.015579042307560267
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441374,
"acc_norm": 0.6996587030716723,
"acc_norm_stderr": 0.01339590930995701
},
"harness|hellaswag|10": {
"acc": 0.6910973909579765,
"acc_stderr": 0.004610966122378293,
"acc_norm": 0.8772156940848437,
"acc_norm_stderr": 0.003275187310785844
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810535,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810535
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8354838709677419,
"acc_stderr": 0.021090847745939317,
"acc_norm": 0.8354838709677419,
"acc_norm_stderr": 0.021090847745939317
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970565,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970565
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634325,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634325
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700486,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700486
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8823529411764706,
"acc_stderr": 0.02261328660113202,
"acc_norm": 0.8823529411764706,
"acc_norm_stderr": 0.02261328660113202
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5039106145251396,
"acc_stderr": 0.016721990073156657,
"acc_norm": 0.5039106145251396,
"acc_norm_stderr": 0.016721990073156657
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.02417084087934086,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.02417084087934086
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7716049382716049,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.7716049382716049,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4954367666232073,
"acc_stderr": 0.012769704263117519,
"acc_norm": 0.4954367666232073,
"acc_norm_stderr": 0.012769704263117519
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.02604066247420124,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.02604066247420124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.01866335967146366,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.01866335967146366
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6214829965226529,
"mc2_stderr": 0.015579042307560267
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237426
},
"harness|gsm8k|5": {
"acc": 0.6679302501895376,
"acc_stderr": 0.012972465034361867
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Himitsui__Kaiju-11B | [
"region:us"
] | 2024-02-13T14:12:57+00:00 | {"pretty_name": "Evaluation run of Himitsui/Kaiju-11B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Himitsui/Kaiju-11B](https://huggingface.co/Himitsui/Kaiju-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Himitsui__Kaiju-11B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T14:10:39.178663](https://huggingface.co/datasets/open-llm-leaderboard/details_Himitsui__Kaiju-11B/blob/main/results_2024-02-13T14-10-39.178663.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6710194744975722,\n \"acc_stderr\": 0.031339984413299014,\n \"acc_norm\": 0.6720247844151348,\n \"acc_norm_stderr\": 0.03197641248956888,\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6214829965226529,\n \"mc2_stderr\": 0.015579042307560267\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441374,\n \"acc_norm\": 0.6996587030716723,\n \"acc_norm_stderr\": 0.01339590930995701\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6910973909579765,\n \"acc_stderr\": 0.004610966122378293,\n \"acc_norm\": 0.8772156940848437,\n \"acc_norm_stderr\": 0.003275187310785844\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810535,\n \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810535\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n 
\"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8354838709677419,\n \"acc_stderr\": 0.021090847745939317,\n \"acc_norm\": 0.8354838709677419,\n \"acc_norm_stderr\": 0.021090847745939317\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n 
\"acc_stderr\": 0.023710888501970565,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634325,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634325\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700486,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700486\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8823529411764706,\n \"acc_stderr\": 0.02261328660113202,\n \"acc_norm\": 0.8823529411764706,\n \"acc_norm_stderr\": 0.02261328660113202\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 
0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5039106145251396,\n \"acc_stderr\": 0.016721990073156657,\n \"acc_norm\": 0.5039106145251396,\n \"acc_norm_stderr\": 0.016721990073156657\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.02417084087934086,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.02417084087934086\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7716049382716049,\n \"acc_stderr\": 0.023358211840626267,\n \"acc_norm\": 0.7716049382716049,\n \"acc_norm_stderr\": 0.023358211840626267\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4954367666232073,\n \"acc_stderr\": 0.012769704263117519,\n \"acc_norm\": 0.4954367666232073,\n \"acc_norm_stderr\": 0.012769704263117519\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7573529411764706,\n \"acc_stderr\": 0.02604066247420124,\n \"acc_norm\": 0.7573529411764706,\n \"acc_norm_stderr\": 0.02604066247420124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.01866335967146366,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.01866335967146366\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6214829965226529,\n \"mc2_stderr\": 0.015579042307560267\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237426\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6679302501895376,\n \"acc_stderr\": 0.012972465034361867\n }\n}\n```", "repo_url": "https://huggingface.co/Himitsui/Kaiju-11B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-10-39.178663.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-10-39.178663.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-10-39.178663.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-10-39.178663.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-10-39.178663.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-10-39.178663.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["**/details_harness|winogrande|5_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T14-10-39.178663.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T14_10_39.178663", "path": ["results_2024-02-13T14-10-39.178663.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T14-10-39.178663.parquet"]}]}]} | 2024-02-13T14:13:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Himitsui/Kaiju-11B
Dataset automatically created during the evaluation run of model Himitsui/Kaiju-11B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
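A minimal sketch of that call (the repository id below is an assumption, inferred from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Load one task configuration from the auto-generated details repository;
# "harness_winogrande_5" is one of the 63 task configs listed in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_Himitsui__Kaiju-11B",
	"harness_winogrande_5",
	split="train")
```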
## Latest results
These are the latest results from run 2024-02-13T14:10:39.178663 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Himitsui/Kaiju-11B\n\n\n\nDataset automatically created during the evaluation run of model Himitsui/Kaiju-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T14:10:39.178663(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Himitsui/Kaiju-11B\n\n\n\nDataset automatically created during the evaluation run of model Himitsui/Kaiju-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T14:10:39.178663(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Himitsui/Kaiju-11B\n\n\n\nDataset automatically created during the evaluation run of model Himitsui/Kaiju-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T14:10:39.178663(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
0464d97b931131817fa29211c7e8b4c6dd540915 |
# Dataset Card for Evaluation run of s3nh/poorx32124
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [s3nh/poorx32124](https://huggingface.co/s3nh/poorx32124) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_s3nh__poorx32124",
"harness_winogrande_5",
split="train")
```
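The same call also works for the aggregated metrics; a short sketch, assuming this repository exposes a "results" config and timestamped splits in the same way as the other leaderboard details repositories:

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split always tracks the newest eval.
results = load_dataset("open-llm-leaderboard/details_s3nh__poorx32124",
	"results",
	split="latest")

# A specific run can instead be selected by its timestamped split name,
# e.g. "2024_02_13T14_37_56.380929" for the run reported below (an inferred name).
```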
## Latest results
These are the [latest results from run 2024-02-13T14:37:56.380929](https://huggingface.co/datasets/open-llm-leaderboard/details_s3nh__poorx32124/blob/main/results_2024-02-13T14-37-56.380929.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.526388001251259,
"acc_stderr": 0.03423883931564769,
"acc_norm": 0.532328851935962,
"acc_norm_stderr": 0.03497682392887963,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756544,
"mc2": 0.5026378052318262,
"mc2_stderr": 0.015007516516438155
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.01461130570505699,
"acc_norm": 0.5315699658703071,
"acc_norm_stderr": 0.014582236460866977
},
"harness|hellaswag|10": {
"acc": 0.5550687114120693,
"acc_stderr": 0.004959425421382025,
"acc_norm": 0.7358095996813384,
"acc_norm_stderr": 0.00440000082274205
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978252,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978252
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273958,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273958
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752052,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752052
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6096774193548387,
"acc_stderr": 0.027751256636969576,
"acc_norm": 0.6096774193548387,
"acc_norm_stderr": 0.027751256636969576
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.032922966391551414,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.032922966391551414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871937,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871937
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7064220183486238,
"acc_stderr": 0.01952515112263967,
"acc_norm": 0.7064220183486238,
"acc_norm_stderr": 0.01952515112263967
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.016225017944770964,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.016225017944770964
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098174,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.015414494487903217,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.015414494487903217
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.026981478043648043,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.026981478043648043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.027044538138402605,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.027044538138402605
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.02847350127296377,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.02847350127296377
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3500651890482399,
"acc_stderr": 0.012182552313215179,
"acc_norm": 0.3500651890482399,
"acc_norm_stderr": 0.012182552313215179
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032939,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032939
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5049019607843137,
"acc_stderr": 0.020226862710039463,
"acc_norm": 0.5049019607843137,
"acc_norm_stderr": 0.020226862710039463
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670237,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670237
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333333,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333333
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691583,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691583
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756544,
"mc2": 0.5026378052318262,
"mc2_stderr": 0.015007516516438155
},
"harness|winogrande|5": {
"acc": 0.6937647987371744,
"acc_stderr": 0.012954385972802471
},
"harness|gsm8k|5": {
"acc": 0.21910538286580744,
"acc_stderr": 0.01139370663497807
}
}
```
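For post-processing these numbers, a purely illustrative sketch (it assumes the JSON object above has been saved locally as `scores.json`, with exactly the structure printed here):

```python
import json

# Parse the per-task results shown in this card.
with open("scores.json") as f:
    scores = json.load(f)

# Mean accuracy over the 57 MMLU (hendrycksTest) sub-tasks.
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU mean acc: {sum(mmlu) / len(mmlu):.4f}")
```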
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_s3nh__poorx32124 | [
"region:us"
] | 2024-02-13T14:34:11+00:00 | {"pretty_name": "Evaluation run of s3nh/poorx32124", "dataset_summary": "Dataset automatically created during the evaluation run of model [s3nh/poorx32124](https://huggingface.co/s3nh/poorx32124) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_s3nh__poorx32124\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T14:37:56.380929](https://huggingface.co/datasets/open-llm-leaderboard/details_s3nh__poorx32124/blob/main/results_2024-02-13T14-37-56.380929.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.526388001251259,\n \"acc_stderr\": 0.03423883931564769,\n \"acc_norm\": 0.532328851935962,\n \"acc_norm_stderr\": 0.03497682392887963,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.5026378052318262,\n \"mc2_stderr\": 0.015007516516438155\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.01461130570505699,\n \"acc_norm\": 0.5315699658703071,\n \"acc_norm_stderr\": 0.014582236460866977\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5550687114120693,\n \"acc_stderr\": 0.004959425421382025,\n \"acc_norm\": 0.7358095996813384,\n \"acc_norm_stderr\": 0.00440000082274205\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978252,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978252\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 
0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.03809342081273958,\n \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.03809342081273958\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.03252909619613197,\n \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.03252909619613197\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.041657747757287644,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.041657747757287644\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752052,\n \"acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752052\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6096774193548387,\n \"acc_stderr\": 0.027751256636969576,\n \"acc_norm\": 0.6096774193548387,\n \"acc_norm_stderr\": 0.027751256636969576\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.032922966391551414,\n \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.032922966391551414\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5205128205128206,\n \"acc_stderr\": 
0.02532966316348994,\n \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871937,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871937\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7064220183486238,\n \"acc_stderr\": 0.01952515112263967,\n \"acc_norm\": 0.7064220183486238,\n \"acc_norm_stderr\": 0.01952515112263967\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.7435897435897436,\n \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n \"acc_stderr\": 0.016225017944770964,\n \"acc_norm\": 0.7100893997445722,\n 
\"acc_norm_stderr\": 0.016225017944770964\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098174,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n \"acc_stderr\": 0.015414494487903217,\n \"acc_norm\": 0.30614525139664805,\n \"acc_norm_stderr\": 0.015414494487903217\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n \"acc_stderr\": 0.026981478043648043,\n \"acc_norm\": 0.6559485530546624,\n \"acc_norm_stderr\": 0.026981478043648043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402605,\n \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402605\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35106382978723405,\n \"acc_stderr\": 0.02847350127296377,\n \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.02847350127296377\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3500651890482399,\n \"acc_stderr\": 0.012182552313215179,\n \"acc_norm\": 0.3500651890482399,\n \"acc_norm_stderr\": 0.012182552313215179\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5049019607843137,\n \"acc_stderr\": 0.020226862710039463,\n \"acc_norm\": 0.5049019607843137,\n \"acc_norm_stderr\": 0.020226862710039463\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670237,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670237\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893782,\n \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893782\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03333333333333333,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03333333333333333\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.5026378052318262,\n \"mc2_stderr\": 0.015007516516438155\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6937647987371744,\n \"acc_stderr\": 0.012954385972802471\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21910538286580744,\n \"acc_stderr\": 0.01139370663497807\n }\n}\n```", "repo_url": "https://huggingface.co/s3nh/poorx32124", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-31-52.359427.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-31-52.359427.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-37-56.380929.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-37-56.380929.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-37-56.380929.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-37-56.380929.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-31-52.359427.parquet"]}, 
{"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["**/details_harness|winogrande|5_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": ["**/details_harness|winogrande|5_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T14-37-56.380929.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T14_31_52.359427", "path": ["results_2024-02-13T14-31-52.359427.parquet"]}, {"split": "2024_02_13T14_37_56.380929", "path": 
["results_2024-02-13T14-37-56.380929.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T14-37-56.380929.parquet"]}]}]} | 2024-02-13T14:40:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of s3nh/poorx32124
Dataset automatically created during the evaluation run of model s3nh/poorx32124 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
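A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming convention (the exact repo id is an assumption; the config and split names are taken from this card's metadata):

```python
from datasets import load_dataset

# Repo id assumed from the Open LLM Leaderboard naming convention for this model.
data = load_dataset(
    "open-llm-leaderboard/details_s3nh__poorx32124",
    "harness_winogrande_5",  # any config name listed in this card's metadata works
    split="train",
)
```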
## Latest results
These are the latest results from run 2024-02-13T14:37:56.380929 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
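For instance, the aggregated metrics live in the "results" configuration, where each run is a timestamped split and "latest" resolves to the newest one. A sketch under the same repo-id assumption as above:

```python
from datasets import load_dataset

# "latest" points to the most recent run, here 2024_02_13T14_37_56.380929.
results = load_dataset(
    "open-llm-leaderboard/details_s3nh__poorx32124",  # assumed repo id
    "results",
    split="latest",
)
```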
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
] |
16aa5d51894e7006e62676ce5fa9936ec14f5f7b |
# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B](https://huggingface.co/TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B",
"harness_winogrande_5",
split="train")
```
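A minimal sketch of pulling the aggregated metrics instead (assuming the `results` configuration and its `latest` split exist exactly as declared in this card's metadata; the split name is an assumption here, since the prose above also mentions a `train` split):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated scores for this model;
# per the metadata below, each configuration also exposes a "latest" split
# pointing at the most recent evaluation run.
agg = load_dataset(
    "open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B",
    "results",
    split="latest",
)
print(agg[0])  # one row holding the aggregated metrics
```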
## Latest results
These are the [latest results from run 2024-02-13T14:34:36.455085](https://huggingface.co/datasets/open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B/blob/main/results_2024-02-13T14-34-36.455085.json) (note that there might be results for other tasks in the repo if successive evaluations didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23171568548592442,
"acc_stderr": 0.0299237713861581,
"acc_norm": 0.23221892225198718,
"acc_norm_stderr": 0.03071612341862599,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807762,
"mc2": 0.4845135742741713,
"mc2_stderr": 0.016732019889852616
},
"harness|arc:challenge|25": {
"acc": 0.2167235494880546,
"acc_stderr": 0.01204015671348119,
"acc_norm": 0.2696245733788396,
"acc_norm_stderr": 0.012968040686869159
},
"harness|hellaswag|10": {
"acc": 0.2568213503286198,
"acc_stderr": 0.004359871519639539,
"acc_norm": 0.26110336586337385,
"acc_norm_stderr": 0.004383384784038464
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398691,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398691
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.023618678310069374,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.023618678310069374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322256,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322256
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.02478907133200763,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.02478907133200763
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807762,
"mc2": 0.4845135742741713,
"mc2_stderr": 0.016732019889852616
},
"harness|winogrande|5": {
"acc": 0.4909234411996843,
"acc_stderr": 0.01405017009449771
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
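These accuracies sit near the ~25% chance level of four-choice multiple-choice tasks. As a rough illustration, here is a minimal sketch for summarizing the per-task scores (assuming the dict shown above has been saved locally as `results.json`; the hosted file may nest it under extra keys, so treat the loading step as an assumption):

```python
import json

# Assumption: the dict printed above was saved locally as results.json.
with open("results.json") as f:
    results = json.load(f)

# Gather normalized accuracy for every MMLU (hendrycksTest) subtask.
mmlu = {
    task: m["acc_norm"]
    for task, m in results.items()
    if task.startswith("harness|hendrycksTest-")
}
mean = sum(mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {mean:.4f}")  # roughly 0.23
```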
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B | [
"region:us"
] | 2024-02-13T14:36:41+00:00 | {"pretty_name": "Evaluation run of TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B", "dataset_summary": "Dataset automatically created during the evaluation run of model [TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B](https://huggingface.co/TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T14:34:36.455085](https://huggingface.co/datasets/open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B/blob/main/results_2024-02-13T14-34-36.455085.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23171568548592442,\n \"acc_stderr\": 0.0299237713861581,\n \"acc_norm\": 0.23221892225198718,\n \"acc_norm_stderr\": 0.03071612341862599,\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807762,\n \"mc2\": 0.4845135742741713,\n \"mc2_stderr\": 0.016732019889852616\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2167235494880546,\n \"acc_stderr\": 0.01204015671348119,\n \"acc_norm\": 0.2696245733788396,\n \"acc_norm_stderr\": 0.012968040686869159\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2568213503286198,\n \"acc_stderr\": 0.004359871519639539,\n \"acc_norm\": 0.26110336586337385,\n \"acc_norm_stderr\": 0.004383384784038464\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n 
\"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501947,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501947\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n \"acc_stderr\": 0.015246803197398691,\n \"acc_norm\": 0.2388250319284802,\n \"acc_norm_stderr\": 0.015246803197398691\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.023618678310069374,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.023618678310069374\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322256,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322256\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.02478907133200763,\n \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.02478907133200763\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807762,\n \"mc2\": 0.4845135742741713,\n \"mc2_stderr\": 0.016732019889852616\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4909234411996843,\n \"acc_stderr\": 0.01405017009449771\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-34-36.455085.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-34-36.455085.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-34-36.455085.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-34-36.455085.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-34-36.455085.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T14_34_36.455085", "path": ["**/details_harness|winogrande|5_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T14-34-36.455085.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_13T14_34_36.455085", "path": ["results_2024-02-13T14-34-36.455085.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T14-34-36.455085.parquet"]}]}]} | 2024-02-13T14:37:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B
Dataset automatically created during the evaluation run of model TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
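The code block was stripped from this rendering; below is a minimal sketch reconstructed from the load pattern used by the other cards in this document. The repo id is assumed from the standard `details_<org>__<model>` naming convention and is not stated in this rendering:
```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> convention used elsewhere in this dump.
data = load_dataset("open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B",
	"harness_winogrande_5",
	split="train")
```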
## Latest results
These are the latest results from run 2024-02-13T14:34:36.455085 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B\n\n\n\nDataset automatically created during the evaluation run of model TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T14:34:36.455085(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B\n\n\n\nDataset automatically created during the evaluation run of model TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T14:34:36.455085(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
201,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B\n\n\n\nDataset automatically created during the evaluation run of model TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T14:34:36.455085(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
f97f7e1673360a396e65e3fc13f5583b91c0c2ae |
# Dataset Card for Evaluation run of shuvom/yuj-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [shuvom/yuj-v1](https://huggingface.co/shuvom/yuj-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shuvom__yuj-v1",
"harness_winogrande_5",
split="train")
```
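Each per-task configuration also exposes a "latest" split, and the aggregated metrics live in the "results" configuration; a minimal sketch using the configuration and split names listed in this card's metadata:
```python
from datasets import load_dataset

# "results" / "latest" are the aggregated-metrics configuration and split
# names declared in this card's metadata.
results = load_dataset("open-llm-leaderboard/details_shuvom__yuj-v1",
	"results",
	split="latest")
```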
## Latest results
These are the [latest results from run 2024-02-13T14:45:38.399135](https://huggingface.co/datasets/open-llm-leaderboard/details_shuvom__yuj-v1/blob/main/results_2024-02-13T14-45-38.399135.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4368690144541721,
"acc_stderr": 0.03434467285481173,
"acc_norm": 0.44258872704716845,
"acc_norm_stderr": 0.03518630942288087,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476194,
"mc2": 0.4169072304332008,
"mc2_stderr": 0.015264102154015026
},
"harness|arc:challenge|25": {
"acc": 0.41638225255972694,
"acc_stderr": 0.014405618279436172,
"acc_norm": 0.4564846416382253,
"acc_norm_stderr": 0.014555949760496437
},
"harness|hellaswag|10": {
"acc": 0.5310695080661223,
"acc_stderr": 0.004980138679161042,
"acc_norm": 0.700955984863573,
"acc_norm_stderr": 0.004569034613332594
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47547169811320755,
"acc_stderr": 0.030735822206205615,
"acc_norm": 0.47547169811320755,
"acc_norm_stderr": 0.030735822206205615
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237657,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237657
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.031907012423268113,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.031907012423268113
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.028434533152681855,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.028434533152681855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678242,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678242
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5252525252525253,
"acc_stderr": 0.03557806245087314,
"acc_norm": 0.5252525252525253,
"acc_norm_stderr": 0.03557806245087314
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6321243523316062,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.6321243523316062,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.02504919787604234,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.02504919787604234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6201834862385321,
"acc_stderr": 0.020808825617866244,
"acc_norm": 0.6201834862385321,
"acc_norm_stderr": 0.020808825617866244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953427,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4214876033057851,
"acc_stderr": 0.04507732278775094,
"acc_norm": 0.4214876033057851,
"acc_norm_stderr": 0.04507732278775094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.02999695185834948,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.02999695185834948
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5644955300127714,
"acc_stderr": 0.01773058992792659,
"acc_norm": 0.5644955300127714,
"acc_norm_stderr": 0.01773058992792659
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.014614465821966353,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.014614465821966353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.40522875816993464,
"acc_stderr": 0.028110928492809068,
"acc_norm": 0.40522875816993464,
"acc_norm_stderr": 0.028110928492809068
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4919614147909968,
"acc_stderr": 0.028394421370984548,
"acc_norm": 0.4919614147909968,
"acc_norm_stderr": 0.028394421370984548
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.02774431344337654,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.02774431344337654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169934,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169934
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3604954367666232,
"acc_stderr": 0.012263110237299233,
"acc_norm": 0.3604954367666232,
"acc_norm_stderr": 0.012263110237299233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.03512310964123936,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.03512310964123936
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5847953216374269,
"acc_stderr": 0.03779275945503201,
"acc_norm": 0.5847953216374269,
"acc_norm_stderr": 0.03779275945503201
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476194,
"mc2": 0.4169072304332008,
"mc2_stderr": 0.015264102154015026
},
"harness|winogrande|5": {
"acc": 0.6985003946329913,
"acc_stderr": 0.012897628072546683
},
"harness|gsm8k|5": {
"acc": 0.047763457164518575,
"acc_stderr": 0.005874387536229317
}
}
```
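For readers who want to aggregate the per-task numbers above, here is a small self-contained sketch that operates on the parsed JSON shown in this section (the parquet schema itself is not documented on this card, so the function takes the plain dict):
```python
# `results` is the dict printed above, e.g. obtained via json.load() on the
# results file linked from this section.
def mean_mmlu_acc_norm(results: dict) -> float:
    """Average normalized accuracy over the hendrycksTest (MMLU) subtasks."""
    scores = [v["acc_norm"]
              for k, v in results.items()
              if k.startswith("harness|hendrycksTest-")]
    return sum(scores) / len(scores)
```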
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_shuvom__yuj-v1 | [
"region:us"
] | 2024-02-13T14:48:00+00:00 | {"pretty_name": "Evaluation run of shuvom/yuj-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [shuvom/yuj-v1](https://huggingface.co/shuvom/yuj-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shuvom__yuj-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T14:45:38.399135](https://huggingface.co/datasets/open-llm-leaderboard/details_shuvom__yuj-v1/blob/main/results_2024-02-13T14-45-38.399135.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4368690144541721,\n \"acc_stderr\": 0.03434467285481173,\n \"acc_norm\": 0.44258872704716845,\n \"acc_norm_stderr\": 0.03518630942288087,\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.015321821688476194,\n \"mc2\": 0.4169072304332008,\n \"mc2_stderr\": 0.015264102154015026\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.41638225255972694,\n \"acc_stderr\": 0.014405618279436172,\n \"acc_norm\": 0.4564846416382253,\n \"acc_norm_stderr\": 0.014555949760496437\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5310695080661223,\n \"acc_stderr\": 0.004980138679161042,\n \"acc_norm\": 0.700955984863573,\n \"acc_norm_stderr\": 0.004569034613332594\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874143,\n \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874143\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205615,\n \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205615\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 
0.04760952285695235\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237657,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237657\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.031907012423268113,\n \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.031907012423268113\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4870967741935484,\n \"acc_stderr\": 0.028434533152681855,\n \"acc_norm\": 0.4870967741935484,\n \"acc_norm_stderr\": 0.028434533152681855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5252525252525253,\n \"acc_stderr\": 0.03557806245087314,\n \"acc_norm\": 0.5252525252525253,\n \"acc_norm_stderr\": 0.03557806245087314\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6321243523316062,\n \"acc_stderr\": 0.034801756684660366,\n \"acc_norm\": 0.6321243523316062,\n \"acc_norm_stderr\": 0.034801756684660366\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4230769230769231,\n \"acc_stderr\": 
0.02504919787604234,\n \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.02504919787604234\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478465,\n \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478465\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6201834862385321,\n \"acc_stderr\": 0.020808825617866244,\n \"acc_norm\": 0.6201834862385321,\n \"acc_norm_stderr\": 0.020808825617866244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953427,\n \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953427\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550988,\n \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550988\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4214876033057851,\n \"acc_stderr\": 0.04507732278775094,\n \"acc_norm\": 0.4214876033057851,\n \"acc_norm_stderr\": 0.04507732278775094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n \"acc_stderr\": 0.02999695185834948,\n \"acc_norm\": 0.7008547008547008,\n \"acc_norm_stderr\": 0.02999695185834948\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5644955300127714,\n \"acc_stderr\": 0.01773058992792659,\n \"acc_norm\": 0.5644955300127714,\n \"acc_norm_stderr\": 0.01773058992792659\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n \"acc_stderr\": 0.014614465821966353,\n \"acc_norm\": 0.2569832402234637,\n \"acc_norm_stderr\": 0.014614465821966353\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.40522875816993464,\n \"acc_stderr\": 0.028110928492809068,\n \"acc_norm\": 0.40522875816993464,\n \"acc_norm_stderr\": 0.028110928492809068\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4919614147909968,\n \"acc_stderr\": 0.028394421370984548,\n \"acc_norm\": 0.4919614147909968,\n \"acc_norm_stderr\": 0.028394421370984548\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.02774431344337654,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.02774431344337654\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169934,\n \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169934\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3604954367666232,\n \"acc_stderr\": 0.012263110237299233,\n \"acc_norm\": 0.3604954367666232,\n \"acc_norm_stderr\": 0.012263110237299233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.0200176292142131,\n \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.0200176292142131\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.031680911612338825,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.031680911612338825\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n \"acc_stderr\": 0.03512310964123936,\n \"acc_norm\": 0.5572139303482587,\n \"acc_norm_stderr\": 0.03512310964123936\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5847953216374269,\n \"acc_stderr\": 0.03779275945503201,\n \"acc_norm\": 0.5847953216374269,\n \"acc_norm_stderr\": 0.03779275945503201\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.015321821688476194,\n \"mc2\": 0.4169072304332008,\n \"mc2_stderr\": 0.015264102154015026\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6985003946329913,\n \"acc_stderr\": 0.012897628072546683\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.047763457164518575,\n \"acc_stderr\": 0.005874387536229317\n }\n}\n```", "repo_url": "https://huggingface.co/shuvom/yuj-v1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-45-38.399135.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-45-38.399135.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-45-38.399135.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-45-38.399135.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-45-38.399135.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-45-38.399135.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["**/details_harness|winogrande|5_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T14-45-38.399135.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T14_45_38.399135", "path": ["results_2024-02-13T14-45-38.399135.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T14-45-38.399135.parquet"]}]}]} | 2024-02-13T14:48:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of shuvom/yuj-v1
Dataset automatically created during the evaluation run of model shuvom/yuj-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
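A sketch of what that loading code looks like, following the pattern used by the other evaluation-run cards in this collection; the repository id below is inferred from the model name and is an assumption, not confirmed by this card:

```python
from datasets import load_dataset

# Repository id inferred from the model name "shuvom/yuj-v1" (assumption).
data = load_dataset("open-llm-leaderboard/details_shuvom__yuj-v1",
	"harness_winogrande_5",
	split="train")
```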
## Latest results
These are the latest results from run 2024-02-13T14:45:38.399135 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of shuvom/yuj-v1\n\n\n\nDataset automatically created during the evaluation run of model shuvom/yuj-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T14:45:38.399135(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of shuvom/yuj-v1\n\n\n\nDataset automatically created during the evaluation run of model shuvom/yuj-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T14:45:38.399135(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of shuvom/yuj-v1\n\n\n\nDataset automatically created during the evaluation run of model shuvom/yuj-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T14:45:38.399135(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
6719a8efa2d09c5fb63a0edae6d0c4e5d1df4ba4 |
<h1 align="center"> ⚛️ SMolInstruct </h1>
SMolInstruct is a **large-scale**, **comprehensive**, and **high-quality instruction tuning dataset** crafted for **chemistry**. It centers around small molecules, and contains 14 meticulously selected tasks and over 3M samples.
**Paper**: [LlaSMol: Advancing Large Language Models for Chemistry with a Large-Scale, Comprehensive, High-Quality Instruction Tuning Dataset](https://arxiv.org/abs/2402.09391)
**Page**: [https://osu-nlp-group.github.io/LlaSMol](https://osu-nlp-group.github.io/LlaSMol)
**Code**: [https://github.com/OSU-NLP-Group/LlaSMol](https://github.com/OSU-NLP-Group/LlaSMol)
**Models**: [https://huggingface.co/osunlp/LlaSMol](https://huggingface.co/osunlp/LlaSMol)
## 🔭 Overview
The following figure illustrates the tasks and corresponding examples.

The following table shows the tasks and statistics over the SMolInstruct dataset, where “Qry.” and “Resp.” are average lengths of queries and responses, respectively.

An example is shown below:
```python
{
'input': 'Based on the given reactants and reagents: <SMILES> CCCCCCCC/C=C\\CCCCCCCC(=O)OCCNCCOC(=O)CCCCCCC/C=C\\CCCCCCCC.CCN=C=NCCCN(C)C.CN(C)C1=CC=NC=C1.CN(C)CCSCC(=O)O.CO.Cl.ClCCl.O.O=C(O)C(F)(F)F.O=C([O-])[O-].[K+] </SMILES>, what product could potentially be produced?',
'output': 'The product can be <SMILES> CCCCCCCC/C=C\\CCCCCCCC(=O)OCCN(CCOC(=O)CCCCCCC/C=C\\CCCCCCCC)C(=O)CSCCN(C)C </SMILES> .',
'raw_input': 'CCCCCCCC/C=C\\CCCCCCCC(=O)OCCNCCOC(=O)CCCCCCC/C=C\\CCCCCCCC.CCN=C=NCCCN(C)C.CN(C)C1=CC=NC=C1.CN(C)CCSCC(=O)O.CO.Cl.ClCCl.O.O=C(O)C(F)(F)F.O=C([O-])[O-].[K+]',
'raw_output': 'CCCCCCCC/C=C\\CCCCCCCC(=O)OCCN(CCOC(=O)CCCCCCC/C=C\\CCCCCCCC)C(=O)CSCCN(C)C',
'split': 'train',
'task': 'forward_synthesis',
'input_core_tag_left': '<SMILES>',
'input_core_tag_right': '</SMILES>',
'output_core_tag_left': '<SMILES>',
'output_core_tag_right': '</SMILES>',
'target': None
}
```
## ⚔️ Usage
You can use the following lines to load the dataset:
```python
from datasets import load_dataset
dataset = load_dataset('osunlp/SMolInstruct')
train_set = dataset['train']
validation_set = dataset['validation']
test_set = dataset['test']
```
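Each sample also carries the 'task' and 'split' fields shown in the example record above, so a loaded split can be filtered directly. A minimal sketch, continuing from the snippet above and assuming the fields shown in the example record:

```python
# Keep only forward-synthesis samples from the training split.
fs_train = train_set.filter(lambda sample: sample['task'] == 'forward_synthesis')
print(len(fs_train), fs_train[0]['input'][:80])
```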
You can also specify what tasks to load:
```python
ALL_TASKS = (
'forward_synthesis',
'retrosynthesis',
'molecule_captioning',
'molecule_generation',
'name_conversion-i2f',
'name_conversion-i2s',
'name_conversion-s2f',
'name_conversion-s2i',
'property_prediction-esol',
'property_prediction-lipo',
'property_prediction-bbbp',
'property_prediction-clintox',
'property_prediction-hiv',
'property_prediction-sider',
)
train_set = load_dataset('osunlp/SMolInstruct', tasks=ALL_TASKS)
```
## 🛠️ Data Construction
The construction of SMolInstruct goes through a four-step pipeline:
- **data collection**: Collect data from various sources and organize it for the tasks.
- **quality control**: Rigorous scrutiny is applied to remove samples with chemically invalid SMILES and wrong or inaccurate information, as well as duplicated samples.
- **data splitting**: Samples are carefully split into train/validation/test sets to avoid data leakage across tasks. The splitting is also compatible with previous work to facilitate fair comparison.
- **instruction construction**: We create natural and diverse templates for the instructions. Molecular SMILES representations are canonicalized to provide a standardized data format. In addition, we use special tags to encapsulate corresponding segments (e.g., <SMILES>...</SMILES> for SMILES) to promote model learning during training and facilitate answer extraction during inference; a minimal sketch of both steps is shown below.
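To make the last two points concrete, here is a minimal sketch of SMILES canonicalization and tag-based answer extraction. The regex and the RDKit calls are illustrative choices, not the exact code used to build the dataset:

```python
import re

# RDKit is a third-party dependency, used here only to illustrate canonicalization.
from rdkit import Chem


def canonicalize_smiles(smiles: str) -> str:
    """Return RDKit's canonical form of a SMILES string."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"chemically invalid SMILES: {smiles!r}")
    return Chem.MolToSmiles(mol)


# Matches the tag-encapsulated segments used throughout SMolInstruct.
SMILES_TAG = re.compile(r"<SMILES>\s*(.*?)\s*</SMILES>", re.DOTALL)


def extract_smiles(response: str) -> list:
    """Pull every <SMILES>...</SMILES> segment out of a model response."""
    return SMILES_TAG.findall(response)


# extract_smiles('The product can be <SMILES> CCO </SMILES> .')  ->  ['CCO']
```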
## 🚨 License
The **SMolInstruct** dataset is licensed under CC BY 4.0.
We emphatically urge all users to adhere to the highest ethical standards when using our dataset, including maintaining fairness, transparency, and responsibility in their research. Any usage of the dataset that may lead to harm or pose a detriment to society is strictly **forbidden**.
## 🔍 Citation
If our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries.
```
@article{yu2024llasmol,
title={LlaSMol: Advancing Large Language Models for Chemistry with a Large-Scale, Comprehensive, High-Quality Instruction Tuning Dataset},
author={Botao Yu and Frazier N. Baker and Ziqi Chen and Xia Ning and Huan Sun},
journal={arXiv preprint arXiv:2402.09391},
year={2024}
}
```
Thank you for your interest in our work.
| osunlp/SMolInstruct | [
"language:en",
"license:cc-by-4.0",
"chemistry",
"molecule",
"small molecule",
"instructions",
"arxiv:2402.09391",
"region:us"
] | 2024-02-13T14:50:59+00:00 | {"language": ["en"], "license": "cc-by-4.0", "tags": ["chemistry", "molecule", "small molecule", "instructions"]} | 2024-02-15T15:57:38+00:00 | [
"2402.09391"
] | [
"en"
] | TAGS
#language-English #license-cc-by-4.0 #chemistry #molecule #small molecule #instructions #arxiv-2402.09391 #region-us
|
<h1 align="center"> ️ SMolInstruct </h1>
SMolInstruct is a large-scale, comprehensive, and high-quality instruction tuning dataset crafted for chemistry. It centers around small molecules, and contains 14 meticulously selected tasks and over 3M samples.
Paper: LlaSMol: Advancing Large Language Models for Chemistry with a Large-Scale, Comprehensive, High-Quality Instruction Tuning Dataset
Page: URL
Code: URL
Models: URL
## Overview
The following figure illustrates the tasks and corresponding examples.
!Overview of the tasks.
The following table shows the tasks and statistics over the SMolInstruct dataset, where “Qry.” and “Resp.” are average lengths of queries and responses, respectively.
!Statistics of the SMolInstruct dataset.
An example is shown below:
## ️ Usage
You can use the following lines to load the dataset:
You can also specify what tasks to load:
## ️ Data Construction
The construction of SMolInstruct goes through a four-step pipeline:
- data collection: Collect data from various sources and organize it for the tasks.
- quality control: Rigorous scrutiny is applied to remove samples with chemically invalid SMILES and wrong or inaccurate information, as well as duplicated samples.
- data splitting: Samples are carefully splitted into train/validation/test set to avoid data leakage across tasks. Also, the splitting is compatible with previous work to faciliate fair comparison.
- instruction construction: We create natural and diverse templates for creating instructions. Molecular SMILES representations are canonicalized to provide a standardized data format. In addition, we use special tags to encapsulate corresponding segments (e.g., <SMILES>...</SMILES>} for SMILES, etc.) to promote model learning during training and faciliate answer extraction during inference.
## License
The SMolInstruct dataset is licensed under CC BY 4.0.
We emphatically urge all users to adhere to the highest ethical standards when using our dataset, including maintaining fairness, transparency, and responsibility in their research. Any usage of the dataset that may lead to harm or pose a detriment to society is strictly forbidden.
## Citation
If our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries.
Thank you for your interest in our work.
| [
"## Overview\n\nThe following figure illustrates the tasks and corresponding examples.\n\n!Overview of the tasks.\n\nThe following table shows the tasks and statistics over the SMolInstruct dataset, where “Qry.” and “Resp.” are average lengths of queries and responses, respectively.\n\n!Statistics of the SMolInstruct dataset.\n\nAn example is shown below:",
"## ️ Usage\nYou can use the following lines to load the dataset:\n\n\nYou can also specify what tasks to load:",
"## ️ Data Construction\n\nThe construction of SMolInstruct goes through a four-step pipeline: \n\n- data collection: Collect data from various sources and organize it for the tasks.\n- quality control: Rigorous scrutiny is applied to remove samples with chemically invalid SMILES and wrong or inaccurate information, as well as duplicated samples.\n- data splitting: Samples are carefully splitted into train/validation/test set to avoid data leakage across tasks. Also, the splitting is compatible with previous work to faciliate fair comparison.\n- instruction construction: We create natural and diverse templates for creating instructions. Molecular SMILES representations are canonicalized to provide a standardized data format. In addition, we use special tags to encapsulate corresponding segments (e.g., <SMILES>...</SMILES>} for SMILES, etc.) to promote model learning during training and faciliate answer extraction during inference.",
"## License\n\nThe SMolInstruct dataset is licensed under CC BY 4.0.\n\nWe emphatically urge all users to adhere to the highest ethical standards when using our dataset, including maintaining fairness, transparency, and responsibility in their research. Any usage of the dataset that may lead to harm or pose a detriment to society is strictly forbidden.",
"## Citation\nIf our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries.\n\n\n\nThank you for your interest in our work."
] | [
"TAGS\n#language-English #license-cc-by-4.0 #chemistry #molecule #small molecule #instructions #arxiv-2402.09391 #region-us \n",
"## Overview\n\nThe following figure illustrates the tasks and corresponding examples.\n\n!Overview of the tasks.\n\nThe following table shows the tasks and statistics over the SMolInstruct dataset, where “Qry.” and “Resp.” are average lengths of queries and responses, respectively.\n\n!Statistics of the SMolInstruct dataset.\n\nAn example is shown below:",
"## ️ Usage\nYou can use the following lines to load the dataset:\n\n\nYou can also specify what tasks to load:",
"## ️ Data Construction\n\nThe construction of SMolInstruct goes through a four-step pipeline: \n\n- data collection: Collect data from various sources and organize it for the tasks.\n- quality control: Rigorous scrutiny is applied to remove samples with chemically invalid SMILES and wrong or inaccurate information, as well as duplicated samples.\n- data splitting: Samples are carefully splitted into train/validation/test set to avoid data leakage across tasks. Also, the splitting is compatible with previous work to faciliate fair comparison.\n- instruction construction: We create natural and diverse templates for creating instructions. Molecular SMILES representations are canonicalized to provide a standardized data format. In addition, we use special tags to encapsulate corresponding segments (e.g., <SMILES>...</SMILES>} for SMILES, etc.) to promote model learning during training and faciliate answer extraction during inference.",
"## License\n\nThe SMolInstruct dataset is licensed under CC BY 4.0.\n\nWe emphatically urge all users to adhere to the highest ethical standards when using our dataset, including maintaining fairness, transparency, and responsibility in their research. Any usage of the dataset that may lead to harm or pose a detriment to society is strictly forbidden.",
"## Citation\nIf our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries.\n\n\n\nThank you for your interest in our work."
] | [
43,
87,
28,
217,
81,
44
] | [
"passage: TAGS\n#language-English #license-cc-by-4.0 #chemistry #molecule #small molecule #instructions #arxiv-2402.09391 #region-us \n## Overview\n\nThe following figure illustrates the tasks and corresponding examples.\n\n!Overview of the tasks.\n\nThe following table shows the tasks and statistics over the SMolInstruct dataset, where “Qry.” and “Resp.” are average lengths of queries and responses, respectively.\n\n!Statistics of the SMolInstruct dataset.\n\nAn example is shown below:## ️ Usage\nYou can use the following lines to load the dataset:\n\n\nYou can also specify what tasks to load:## ️ Data Construction\n\nThe construction of SMolInstruct goes through a four-step pipeline: \n\n- data collection: Collect data from various sources and organize it for the tasks.\n- quality control: Rigorous scrutiny is applied to remove samples with chemically invalid SMILES and wrong or inaccurate information, as well as duplicated samples.\n- data splitting: Samples are carefully splitted into train/validation/test set to avoid data leakage across tasks. Also, the splitting is compatible with previous work to faciliate fair comparison.\n- instruction construction: We create natural and diverse templates for creating instructions. Molecular SMILES representations are canonicalized to provide a standardized data format. In addition, we use special tags to encapsulate corresponding segments (e.g., <SMILES>...</SMILES>} for SMILES, etc.) to promote model learning during training and faciliate answer extraction during inference.## License\n\nThe SMolInstruct dataset is licensed under CC BY 4.0.\n\nWe emphatically urge all users to adhere to the highest ethical standards when using our dataset, including maintaining fairness, transparency, and responsibility in their research. Any usage of the dataset that may lead to harm or pose a detriment to society is strictly forbidden.## Citation\nIf our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries.\n\n\n\nThank you for your interest in our work."
] |
dae489926b202e2c301bdeafccb696faf4e59b18 |
# Dataset Card for Evaluation run of nlpguy/AlloyIngotNeo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/AlloyIngotNeo](https://huggingface.co/nlpguy/AlloyIngotNeo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__AlloyIngotNeo",
"harness_winogrande_5",
split="train")
```
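The configuration list in this card's metadata also exposes the aggregated scores as a separate "results" configuration whose "latest" split tracks the newest run; a minimal sketch, assuming that layout:

```python
from datasets import load_dataset

# Aggregated metrics live in the "results" config; "latest" points at the newest run.
results = load_dataset("open-llm-leaderboard/details_nlpguy__AlloyIngotNeo",
	"results",
	split="latest")
```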
## Latest results
These are the [latest results from run 2024-02-13T14:49:47.237954](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__AlloyIngotNeo/blob/main/results_2024-02-13T14-49-47.237954.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.652203561260724,
"acc_stderr": 0.03198137273405764,
"acc_norm": 0.6515967475168507,
"acc_norm_stderr": 0.03264850422564272,
"mc1": 0.6107711138310894,
"mc1_stderr": 0.017068552680690338,
"mc2": 0.7594522057332478,
"mc2_stderr": 0.014122524206259661
},
"harness|arc:challenge|25": {
"acc": 0.7073378839590444,
"acc_stderr": 0.013295916103619422,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7140011949810795,
"acc_stderr": 0.004509652679395677,
"acc_norm": 0.8898625771758614,
"acc_norm_stderr": 0.00312421161719886
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903347,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45027932960893857,
"acc_stderr": 0.016639615236845807,
"acc_norm": 0.45027932960893857,
"acc_norm_stderr": 0.016639615236845807
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6107711138310894,
"mc1_stderr": 0.017068552680690338,
"mc2": 0.7594522057332478,
"mc2_stderr": 0.014122524206259661
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598484
},
"harness|gsm8k|5": {
"acc": 0.6944655041698257,
"acc_stderr": 0.012688134076726882
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_nlpguy__AlloyIngotNeo | [
"region:us"
] | 2024-02-13T14:52:05+00:00 | {"pretty_name": "Evaluation run of nlpguy/AlloyIngotNeo", "dataset_summary": "Dataset automatically created during the evaluation run of model [nlpguy/AlloyIngotNeo](https://huggingface.co/nlpguy/AlloyIngotNeo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__AlloyIngotNeo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T14:49:47.237954](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__AlloyIngotNeo/blob/main/results_2024-02-13T14-49-47.237954.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.652203561260724,\n \"acc_stderr\": 0.03198137273405764,\n \"acc_norm\": 0.6515967475168507,\n \"acc_norm_stderr\": 0.03264850422564272,\n \"mc1\": 0.6107711138310894,\n \"mc1_stderr\": 0.017068552680690338,\n \"mc2\": 0.7594522057332478,\n \"mc2_stderr\": 0.014122524206259661\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7073378839590444,\n \"acc_stderr\": 0.013295916103619422,\n \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7140011949810795,\n \"acc_stderr\": 0.004509652679395677,\n \"acc_norm\": 0.8898625771758614,\n \"acc_norm_stderr\": 0.00312421161719886\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n 
\"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903347,\n \"acc_norm\": 0.8250319284802043,\n 
\"acc_norm_stderr\": 0.013586619219903347\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45027932960893857,\n \"acc_stderr\": 0.016639615236845807,\n \"acc_norm\": 0.45027932960893857,\n \"acc_norm_stderr\": 0.016639615236845807\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6107711138310894,\n \"mc1_stderr\": 0.017068552680690338,\n \"mc2\": 0.7594522057332478,\n \"mc2_stderr\": 0.014122524206259661\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598484\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6944655041698257,\n \"acc_stderr\": 0.012688134076726882\n }\n}\n```", "repo_url": "https://huggingface.co/nlpguy/AlloyIngotNeo", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-49-47.237954.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-49-47.237954.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-49-47.237954.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-49-47.237954.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-49-47.237954.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-49-47.237954.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["**/details_harness|winogrande|5_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T14-49-47.237954.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T14_49_47.237954", "path": ["results_2024-02-13T14-49-47.237954.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T14-49-47.237954.parquet"]}]}]} | 2024-02-13T14:52:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of nlpguy/AlloyIngotNeo
Dataset automatically created during the evaluation run of model nlpguy/AlloyIngotNeo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
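```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__AlloyIngotNeo",
	"harness_winogrande_5",
	split="train")
```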
## Latest results
These are the latest results from run 2024-02-13T14:49:47.237954 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of nlpguy/AlloyIngotNeo\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/AlloyIngotNeo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T14:49:47.237954(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nlpguy/AlloyIngotNeo\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/AlloyIngotNeo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T14:49:47.237954(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of nlpguy/AlloyIngotNeo\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/AlloyIngotNeo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T14:49:47.237954(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
03d830930f380a3a4a93e6a5a92b69a53d477518 | # Dataset Card for "5k-govreport-4096"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | anumafzal94/5k-govreport-4096 | [
"region:us"
] | 2024-02-13T14:58:33+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 51579777, "num_examples": 972}, {"name": "train", "num_bytes": 62579581.31224934, "num_examples": 1148}, {"name": "validation", "num_bytes": 2871243.672839506, "num_examples": 50}], "download_size": 32692704, "dataset_size": 117030601.98508884}} | 2024-02-13T14:58:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "5k-govreport-4096"
More Information needed | [
"# Dataset Card for \"5k-govreport-4096\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"5k-govreport-4096\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"5k-govreport-4096\"\n\nMore Information needed"
] |
d9c3ef7803365e7556bee8591a61251103fefe49 | # Twitter Neighbours dataset
This repository contains the dataset assembled as part of an APPRAISE project (H2020-SU-SEC-2020)
# Description
The dataset contains a sampled graph, extracted from Twitter, starting from a list of seed users.
Each user is initially represented by a semantic embedding (vector) computed as the average text embedding of a sample of its tweets, while users are connected with each other through the 'following/follower' property on Twitter.
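As an illustration, a user embedding of this kind could be computed along the following lines. This is only a sketch: the encoder shown ("all-mpnet-base-v2", chosen because it outputs 768-dimensional vectors) is an assumption, since the card does not name the model that was actually used.

```python
# Hypothetical sketch: a user's embedding as the average of its tweet embeddings.
# The encoder choice is an assumption; the card only states embeddings are 768-d.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-mpnet-base-v2")  # any 768-d text encoder works

def user_embedding(sampled_tweets: list[str]) -> np.ndarray:
    tweet_embs = model.encode(sampled_tweets)  # shape: (n_tweets, 768)
    return tweet_embs.mean(axis=0)             # average over tweets -> (768,)

vec = user_embedding(["first sampled tweet", "second sampled tweet"])
assert vec.shape == (768,)
```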
# Statistics
- number of users: 36122
- number of edges: 84026
- user initial embeddings size: 768
# Files:
graph_train_and_test.pygeodata: a compact representation of the graph for PyG (PyTorch Geometric) usage.
twitter_neighs_graph.json: a dictionary containing:
- 'adj_sparse': adjacency matrix in sparse representation
- 'user_init_embs': initial user embedding, computed as average of text embedding of user tweets
- 'train_test_split': 0/1 1D list, where 0 represents the user is in the training set and 1 the user is in the test set (see the loading sketch below)
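A minimal sketch of reading twitter_neighs_graph.json follows. The exact encoding of 'adj_sparse' is not specified by the card, so the assumption that each field converts cleanly to a NumPy array is illustrative only.

```python
# Hypothetical sketch for consuming twitter_neighs_graph.json.
# Assumptions: 'adj_sparse' converts to an index array and 'user_init_embs'
# holds one 768-d vector per user, as described above.
import json
import numpy as np

with open("twitter_neighs_graph.json") as f:
    graph = json.load(f)

edges = np.asarray(graph["adj_sparse"])        # sparse adjacency representation
embs = np.asarray(graph["user_init_embs"])     # expected shape: (36122, 768)
split = np.asarray(graph["train_test_split"])  # 0 = training user, 1 = test user

train_idx = np.where(split == 0)[0]            # users in the training set
test_idx = np.where(split == 1)[0]             # users in the test set
print(edges.shape, embs.shape, len(train_idx), len(test_idx))

# The PyG file can presumably be loaded with torch.load, assuming it was
# saved via torch.save (the card does not say how it was serialized):
# data = torch.load("graph_train_and_test.pygeodata")
```
| links-ads/Twitter-Neighbours | [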
"region:us"
] | 2024-02-13T14:58:37+00:00 | {} | 2024-02-14T12:03:56+00:00 | [] | [] | TAGS
#region-us
| # Twitter Neighbours dataset
This repository contains the dataset assembled as part of an APPRAISE project (H2020-SU-SEC-2020)
# Description
The dataset contains a sampled graph, extracted from Twitter, starting from a list of seed users.
Each user is initially represented by a semantic embedding (vector) computed as the average text embedding of a sample of its tweets, while users are connected with each other through the 'following/follower' property on Twitter.
# Statistics
- number of users: 36122
- number of edges: 84026
- user initial embeddings size: 768
# Files:
graph_train_and_test.pygeodata: a compact representation of the graph for PyG usage.
twitter_neighs_graph.json: a dictionary containing:
- 'adj_sparse': adjacency matrix in sparse representation
- 'user_init_embs': initial user embedding, computed as average of text embedding of user tweets
- 'train_test_split': 0/1 1D list, where 0 represents the user is in the training set and 1 the user is in the test set | [
"# Twitter Neighbours dataset\n\nThis repository contains the dataset assembled as part of a APPRAISE project (H2020-SU-SEC-2020H2020-SU-SEC-2020)",
"# Description\nThe dataset contains a sampled graph, extracted from Twitter, starting from a list of seed users.\nEach user is initially represented by a semantic embedding (vector) computed as the average text embedding of a sample of its tweets, while users are connected with each other through the 'following/follower' property on Twitter.",
"# Statistics\n- number of users: 36122\n- number of edges: 84026\n- user initial embeddings size: 768",
"# Files:\ngraph_train_and_test.pygeodata: is a compact representation of the graph for PyG usage.\n\ntwitter_neighs_graph.json: a dictionary containing:\n - 'adj_sparse': adjacency matrix in sparse representation\n - 'user_init_embs': initial user embedding, computed as average of text embedding of user tweets\n - 'train_test_split': 0/1 1D list, where 0 represents the user is in teh training set and 1 the user is in the test set"
] | [
"TAGS\n#region-us \n",
"# Twitter Neighbours dataset\n\nThis repository contains the dataset assembled as part of a APPRAISE project (H2020-SU-SEC-2020H2020-SU-SEC-2020)",
"# Description\nThe dataset contains a sampled graph, extracted from Twitter, starting from a list of seed users.\nEach user is initially represented by a semantic embedding (vector) computed as the average text embedding of a sample of its tweets, while users are connected with each other through the 'following/follower' property on Twitter.",
"# Statistics\n- number of users: 36122\n- number of edges: 84026\n- user initial embeddings size: 768",
"# Files:\ngraph_train_and_test.pygeodata: is a compact representation of the graph for PyG usage.\n\ntwitter_neighs_graph.json: a dictionary containing:\n - 'adj_sparse': adjacency matrix in sparse representation\n - 'user_init_embs': initial user embedding, computed as average of text embedding of user tweets\n - 'train_test_split': 0/1 1D list, where 0 represents the user is in teh training set and 1 the user is in the test set"
] | [
6,
46,
83,
30,
138
] | [
"passage: TAGS\n#region-us \n# Twitter Neighbours dataset\n\nThis repository contains the dataset assembled as part of a APPRAISE project (H2020-SU-SEC-2020H2020-SU-SEC-2020)# Description\nThe dataset contains a sampled graph, extracted from Twitter, starting from a list of seed users.\nEach user is initially represented by a semantic embedding (vector) computed as the average text embedding of a sample of its tweets, while users are connected with each other through the 'following/follower' property on Twitter.# Statistics\n- number of users: 36122\n- number of edges: 84026\n- user initial embeddings size: 768# Files:\ngraph_train_and_test.pygeodata: is a compact representation of the graph for PyG usage.\n\ntwitter_neighs_graph.json: a dictionary containing:\n - 'adj_sparse': adjacency matrix in sparse representation\n - 'user_init_embs': initial user embedding, computed as average of text embedding of user tweets\n - 'train_test_split': 0/1 1D list, where 0 represents the user is in teh training set and 1 the user is in the test set"
] |
b40f5d0377d10853bd6e39a0503830c476442a11 | # Dataset Card for "Test_Dataset_0213_hf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ouvic215/Test_Dataset_0213_hf | [
"region:us"
] | 2024-02-13T14:59:47+00:00 | {"dataset_info": {"features": [{"name": "mask_image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 147332332.0, "num_examples": 1588}], "download_size": 146499523, "dataset_size": 147332332.0}} | 2024-02-13T15:00:10+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Test_Dataset_0213_hf"
More Information needed | [
"# Dataset Card for \"Test_Dataset_0213_hf\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Test_Dataset_0213_hf\"\n\nMore Information needed"
] | [
6,
20
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"Test_Dataset_0213_hf\"\n\nMore Information needed"
] |
3637a08f662adc96e473a65d0e4be40d2db861ea |
# Dataset Card for Evaluation run of nlpguy/AlloyIngot
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/AlloyIngot](https://huggingface.co/nlpguy/AlloyIngot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__AlloyIngot",
"harness_winogrande_5",
split="train")
```
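The object returned above is a regular `datasets.Dataset`, so the usual inspection helpers apply. A minimal sketch (the per-example column layout varies across tasks and harness versions, so it is inspected rather than hard-coded):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_nlpguy__AlloyIngot",
                    "harness_winogrande_5",
                    split="train")

# Inspect the split before indexing into it: row counts and column names
# differ between tasks and harness versions.
print(data.num_rows)
print(data.column_names)

# The first per-example record, as a plain Python dict.
print(data[0])
```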
## Latest results
These are the [latest results from run 2024-02-13T14:57:48.240090](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__AlloyIngot/blob/main/results_2024-02-13T14-57-48.240090.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6546973152860399,
"acc_stderr": 0.03202048867012665,
"acc_norm": 0.6539786181983979,
"acc_norm_stderr": 0.03269225603389572,
"mc1": 0.6046511627906976,
"mc1_stderr": 0.017115815632418208,
"mc2": 0.7511621845735572,
"mc2_stderr": 0.01426339938256465
},
"harness|arc:challenge|25": {
"acc": 0.7192832764505119,
"acc_stderr": 0.013131238126975578,
"acc_norm": 0.7397610921501706,
"acc_norm_stderr": 0.01282193022511257
},
"harness|hellaswag|10": {
"acc": 0.7204740091615216,
"acc_stderr": 0.0044784916978912286,
"acc_norm": 0.8904600677155945,
"acc_norm_stderr": 0.003116771577319422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45027932960893857,
"acc_stderr": 0.01663961523684581,
"acc_norm": 0.45027932960893857,
"acc_norm_stderr": 0.01663961523684581
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6046511627906976,
"mc1_stderr": 0.017115815632418208,
"mc2": 0.7511621845735572,
"mc2_stderr": 0.01426339938256465
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.01001259880562729
},
"harness|gsm8k|5": {
"acc": 0.6914329037149356,
"acc_stderr": 0.012723076049815896
}
}
```
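These aggregated numbers are also available programmatically through the "results" configuration mentioned above. A small sketch, assuming only what this card's configuration list states (a "results" config with a "latest" split); the exact column layout of the aggregated table depends on the harness version, so it is printed rather than assumed:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run; the "latest" split
# always points at the most recent upload for this model.
results = load_dataset("open-llm-leaderboard/details_nlpguy__AlloyIngot",
                       "results",
                       split="latest")

print(results.column_names)  # check the layout before indexing
print(results[0])            # aggregated metrics for this run as a dict
```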
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_nlpguy__AlloyIngot | [
"region:us"
] | 2024-02-13T15:00:09+00:00 | {"pretty_name": "Evaluation run of nlpguy/AlloyIngot", "dataset_summary": "Dataset automatically created during the evaluation run of model [nlpguy/AlloyIngot](https://huggingface.co/nlpguy/AlloyIngot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__AlloyIngot\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T14:57:48.240090](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__AlloyIngot/blob/main/results_2024-02-13T14-57-48.240090.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6546973152860399,\n \"acc_stderr\": 0.03202048867012665,\n \"acc_norm\": 0.6539786181983979,\n \"acc_norm_stderr\": 0.03269225603389572,\n \"mc1\": 0.6046511627906976,\n \"mc1_stderr\": 0.017115815632418208,\n \"mc2\": 0.7511621845735572,\n \"mc2_stderr\": 0.01426339938256465\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7192832764505119,\n \"acc_stderr\": 0.013131238126975578,\n \"acc_norm\": 0.7397610921501706,\n \"acc_norm_stderr\": 0.01282193022511257\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7204740091615216,\n \"acc_stderr\": 0.0044784916978912286,\n \"acc_norm\": 0.8904600677155945,\n \"acc_norm_stderr\": 0.003116771577319422\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n 
\"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n 
\"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 
0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45027932960893857,\n \"acc_stderr\": 0.01663961523684581,\n \"acc_norm\": 0.45027932960893857,\n \"acc_norm_stderr\": 0.01663961523684581\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6046511627906976,\n \"mc1_stderr\": 0.017115815632418208,\n \"mc2\": 0.7511621845735572,\n \"mc2_stderr\": 0.01426339938256465\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.01001259880562729\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6914329037149356,\n \"acc_stderr\": 0.012723076049815896\n }\n}\n```", "repo_url": "https://huggingface.co/nlpguy/AlloyIngot", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-57-48.240090.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-57-48.240090.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-57-48.240090.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T14-57-48.240090.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-57-48.240090.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-57-48.240090.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["**/details_harness|winogrande|5_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T14-57-48.240090.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T14_57_48.240090", "path": ["results_2024-02-13T14-57-48.240090.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T14-57-48.240090.parquet"]}]}]} | 2024-02-13T15:00:31+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of nlpguy/AlloyIngot
Dataset automatically created during the evaluation run of model nlpguy/AlloyIngot on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
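Following the naming convention used for the other evaluation-run datasets in this document, the details repository for this model is assumed to be `open-llm-leaderboard/details_nlpguy__AlloyIngot`; a minimal loading sketch:

```python
from datasets import load_dataset

# Load one of the 63 task configurations; the "latest" split always
# points to the newest run, "train" here follows the card's convention.
data = load_dataset("open-llm-leaderboard/details_nlpguy__AlloyIngot",
	"harness_winogrande_5",
	split="train")
```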
## Latest results
These are the latest results from run 2024-02-13T14:57:48.240090 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of nlpguy/AlloyIngot\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/AlloyIngot on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T14:57:48.240090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nlpguy/AlloyIngot\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/AlloyIngot on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T14:57:48.240090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of nlpguy/AlloyIngot\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/AlloyIngot on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T14:57:48.240090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
6cdbe088a6d54ac24b54c0b386fc95b65d72d97a | # Wikipedia Dump with Gold Documents from Natural Questions
## Dataset Summary
This dataset combines the English Wikipedia dump from December 20, 2018, with gold passages from the [Natural Questions](https://huggingface.co/datasets/natural_questions) (NQ) dataset,
specifically tailored for open-domain question answering tasks. By integrating gold documents corresponding to each query in the [NQ-open](https://huggingface.co/datasets/nq_open)
version of the dataset, this resource addresses potential mismatches between the Wikipedia dump and the question-answer pairs found in NQ-open.
Such mismatches can lead to scenarios where the dump does not contain the required answer.
A thorough process of duplicate filtering was applied to ensure the precise identification of the gold document for each query,
enhancing the reliability of the dataset for natural language processing tasks.
Therefore, the dataset can be employed as a knowledge base for RAG systems.
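As a minimal sketch of that use, the corpus can be loaded with the `datasets` library; the split name `train` is an assumption here, and streaming avoids materializing all 21M passages in memory:

```python
from datasets import load_dataset

# Stream the passage corpus rather than downloading it all at once.
wiki = load_dataset("florin-hf/wiki_dump2018_nq_open", split="train", streaming=True)

# Each record carries the passage text and its source article title
# (field names as shown in the Dataset Structure example below).
first = next(iter(wiki))
print(first["title"])
print(first["text"][:200])
```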
One critical aspect of dataset preparation involved addressing the constraints posed by Large Language Models (LLMs) regarding input size.
LLMs, particularly when processing multiple documents in a single prompt, face limitations on the length of input they can efficiently handle.
To accommodate this, gold documents exceeding 512 tokens ([tokenized with Llama2](https://huggingface.co/docs/transformers/model_doc/llama2#transformers.LlamaTokenizer))
were excluded from the dataset. This decision was guided by the objective of maximizing the number of documents that can be included in the LLM's prompt
without compromising on the detail or context provided by each document.
As a result, the final dataset encompasses **21,035,236** documents (13.9 GB).
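A minimal sketch of the length filter described above, assuming the Hugging Face `transformers` Llama-2 tokenizer (the exact checkpoint and filtering code used by the dataset authors are not specified here):

```python
from transformers import AutoTokenizer

# Assumed checkpoint; any Llama-2 tokenizer with the same vocabulary
# would produce the same token counts.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

def keep_document(text: str, max_tokens: int = 512) -> bool:
    # Count tokens without special tokens and drop documents over the budget.
    n_tokens = len(tokenizer(text, add_special_tokens=False)["input_ids"])
    return n_tokens <= max_tokens
```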
## Dataset Sources
- **Original Wikipedia Dump**: The corpus originates from the English Wikipedia dump, where articles are segmented into non-overlapping passages of 100 words.
[Download link](https://dl.fbaipublicfiles.com/dpr/wikipedia_split/psgs_w100.tsv.gz).
- **Gold Passages**: Sourced from the Natural Questions dataset, these passages are integrated to provide a comprehensive resource for question answering.
The gold passages are accessible through the following URLs:
- [train](https://dl.fbaipublicfiles.com/dpr/data/nq_gold_info/nq-train_gold_info.json.gz)
- [dev](https://dl.fbaipublicfiles.com/dpr/data/nq_gold_info/nq-dev_gold_info.json.gz)
- [test](https://dl.fbaipublicfiles.com/dpr/data/nq_gold_info/nq-test_gold_info.json.gz)
The above data comes from the Dense Passage Retrieval (DPR) [github repository](https://github.com/facebookresearch/DPR/blob/main/dpr/data/download_data.py).
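A small sketch of fetching and reading one of these gold-passage files (URL as listed above; the internal layout of the parsed JSON object is not documented here, so the final step is only illustrative):

```python
import gzip
import json
import urllib.request

# Download the dev-split gold info archive from the DPR mirror.
url = "https://dl.fbaipublicfiles.com/dpr/data/nq_gold_info/nq-dev_gold_info.json.gz"
urllib.request.urlretrieve(url, "nq-dev_gold_info.json.gz")

# Decompress on the fly and parse the JSON payload.
with gzip.open("nq-dev_gold_info.json.gz", "rt", encoding="utf-8") as f:
    gold = json.load(f)
```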
## Dataset Structure
An example of a Wikipedia passage is as follows:
```
{
"text": Home computers were a class of microcomputers entering the market in 1977, and becoming common during the 1980s.
They were marketed to consumers as affordable and accessible computers that, for the first time, were intended for the use of a single nontechnical user.
These computers were a distinct market segment that typically cost much less than business,
scientific or engineering-oriented computers of the time such as the IBM PC, and were generally less powerful in terms of memory and expandability.
However, a home computer often had better graphics and sound than contemporary business computers. Their most common uses were playing
"title": "Home computer"
}
``` | florin-hf/wiki_dump2018_nq_open | [
"task_categories:question-answering",
"size_categories:10M<n<100M",
"language:en",
"region:us"
] | 2024-02-13T15:40:20+00:00 | {"language": ["en"], "size_categories": ["10M<n<100M"], "task_categories": ["question-answering"], "pretty_name": "v"} | 2024-02-16T09:30:57+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-10M<n<100M #language-English #region-us
| # Wikipedia Dump with Gold Documents from Natural Questions
## Dataset Summary
This dataset combines the English Wikipedia dump from December 20, 2018, with gold passages from the Natural Questions (NQ) dataset,
specifically tailored for open-domain question answering tasks. By integrating gold documents corresponding to each query in the NQ-open
version of the dataset, this resource addresses potential mismatches between the Wikipedia dump and the question-answer pairs found in NQ-open.
Such mismatches can lead to scenarios where the dump does not contain the required answer.
A thorough process of duplicate filtering was applied to ensure the precise identification of the gold document for each query,
enhancing the reliability of the dataset for natural language processing tasks.
Therefore, the dataset can be employed as a knowledge base for RAG systems.
One critical aspect of dataset preparation involved addressing the constraints posed by Large Language Models (LLMs) regarding input size.
LLMs, particularly when processing multiple documents in a single prompt, face limitations on the length of input they can efficiently handle.
To accommodate this, gold documents exceeding 512 tokens (tokenized with Llama2)
were excluded from the dataset. This decision was guided by the objective of maximizing the number of documents that can be included in the LLM's prompt
without compromising on the detail or context provided by each document.
As a result, the final dataset encompasses 21,035,236 documents (13.9 GB).
## Dataset Sources
- Original Wikipedia Dump: The corpus originates from the English Wikipedia dump, where articles are segmented into non-overlapping passages of 100 words.
Download link.
- Gold Passages: Sourced from the Natural Questions dataset, these passages are integrated to provide a comprehensive resource for question answering.
The gold passages are accessible through the following URLs:
- train
- dev
- test
The above data comes from the Dense Passage Retrieval (DPR) github repository.
## Dataset Structure
An example of a Wikipedia passage is as follows:
| [
"# Wikipedia Dump with Gold Documents from Natural Questions",
"## Dataset Summary\nThis dataset combines the English Wikipedia dump from December 20, 2018, with gold passages from the Natural Questions (NQ) dataset, \nspecifically tailored for open-domain question answering tasks. By integrating gold documents corresponding to each query in the NQ-open\nversion of the dataset, this resource addresses potential mismatches between the Wikipedia dump and the question-answer pairs found in NQ-open. \nSuch mismatches can lead to scenarios where the dump does not contain the required answer. \nA thorough process of duplicate filtering was applied to ensure the precise identification of the gold document for each query, \nenhancing the reliability of the dataset for natural language processing tasks.\n\n\nTherefore, the dataset can be employed as a knowledge base for RAG systems.\nOne critical aspect of dataset preparation involved addressing the constraints posed by Large Language Models (LLMs) regarding input size. \nLLMs, particularly when processing multiple documents in a single prompt, face limitations on the length of input they can efficiently handle. \nTo accommodate this, gold documents exceeding 512 tokens (tokenized with Llama2)\nwere excluded from the dataset. This decision was guided by the objective of maximizing the number of documents that can be included in the LLM's prompt \nwithout compromising on the detail or context provided by each document. \nAs a result, the final dataset encompasses 21,035,236 documents (13.9 GB).",
"## Dataset Sources\n\n- Original Wikipedia Dump: The corpus originates from the English Wikipedia dump, where articles are segmented into non-overlapping passages of 100 words.\nDownload link.\n\n- Gold Passages: Sourced from the Natural Questions dataset, these passages are integrated to provide a comprehensive resource for question answering.\nThe gold passages are accessible through the following URLs:\n - train\n - dev\n - test\n\nThe above data comes from the Dense Passage Retrieval (DPR) github repository.",
"## Dataset Structure\n\nAn example of a Wikipedia passage is as follows:"
] | [
"TAGS\n#task_categories-question-answering #size_categories-10M<n<100M #language-English #region-us \n",
"# Wikipedia Dump with Gold Documents from Natural Questions",
"## Dataset Summary\nThis dataset combines the English Wikipedia dump from December 20, 2018, with gold passages from the Natural Questions (NQ) dataset, \nspecifically tailored for open-domain question answering tasks. By integrating gold documents corresponding to each query in the NQ-open\nversion of the dataset, this resource addresses potential mismatches between the Wikipedia dump and the question-answer pairs found in NQ-open. \nSuch mismatches can lead to scenarios where the dump does not contain the required answer. \nA thorough process of duplicate filtering was applied to ensure the precise identification of the gold document for each query, \nenhancing the reliability of the dataset for natural language processing tasks.\n\n\nTherefore, the dataset can be employed as a knowledge base for RAG systems.\nOne critical aspect of dataset preparation involved addressing the constraints posed by Large Language Models (LLMs) regarding input size. \nLLMs, particularly when processing multiple documents in a single prompt, face limitations on the length of input they can efficiently handle. \nTo accommodate this, gold documents exceeding 512 tokens (tokenized with Llama2)\nwere excluded from the dataset. This decision was guided by the objective of maximizing the number of documents that can be included in the LLM's prompt \nwithout compromising on the detail or context provided by each document. \nAs a result, the final dataset encompasses 21,035,236 documents (13.9 GB).",
"## Dataset Sources\n\n- Original Wikipedia Dump: The corpus originates from the English Wikipedia dump, where articles are segmented into non-overlapping passages of 100 words.\nDownload link.\n\n- Gold Passages: Sourced from the Natural Questions dataset, these passages are integrated to provide a comprehensive resource for question answering.\nThe gold passages are accessible through the following URLs:\n - train\n - dev\n - test\n\nThe above data comes from the Dense Passage Retrieval (DPR) github repository.",
"## Dataset Structure\n\nAn example of a Wikipedia passage is as follows:"
] | [
34,
12,
333,
114,
17
] | [
"passage: TAGS\n#task_categories-question-answering #size_categories-10M<n<100M #language-English #region-us \n# Wikipedia Dump with Gold Documents from Natural Questions## Dataset Summary\nThis dataset combines the English Wikipedia dump from December 20, 2018, with gold passages from the Natural Questions (NQ) dataset, \nspecifically tailored for open-domain question answering tasks. By integrating gold documents corresponding to each query in the NQ-open\nversion of the dataset, this resource addresses potential mismatches between the Wikipedia dump and the question-answer pairs found in NQ-open. \nSuch mismatches can lead to scenarios where the dump does not contain the required answer. \nA thorough process of duplicate filtering was applied to ensure the precise identification of the gold document for each query, \nenhancing the reliability of the dataset for natural language processing tasks.\n\n\nTherefore, the dataset can be employed as a knowledge base for RAG systems.\nOne critical aspect of dataset preparation involved addressing the constraints posed by Large Language Models (LLMs) regarding input size. \nLLMs, particularly when processing multiple documents in a single prompt, face limitations on the length of input they can efficiently handle. \nTo accommodate this, gold documents exceeding 512 tokens (tokenized with Llama2)\nwere excluded from the dataset. This decision was guided by the objective of maximizing the number of documents that can be included in the LLM's prompt \nwithout compromising on the detail or context provided by each document. \nAs a result, the final dataset encompasses 21,035,236 documents (13.9 GB).## Dataset Sources\n\n- Original Wikipedia Dump: The corpus originates from the English Wikipedia dump, where articles are segmented into non-overlapping passages of 100 words.\nDownload link.\n\n- Gold Passages: Sourced from the Natural Questions dataset, these passages are integrated to provide a comprehensive resource for question answering.\nThe gold passages are accessible through the following URLs:\n - train\n - dev\n - test\n\nThe above data comes from the Dense Passage Retrieval (DPR) github repository."
] |
5d96436ede1d3c4b5153918c872a5584fe8c2740 |
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v3.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v3.3](https://huggingface.co/bardsai/jaskier-7b-dpo-v3.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3",
"harness_winogrande_5",
split="train")
```
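To enumerate all 63 available configurations before picking one, the `datasets` library also exposes a helper; a small usage sketch:

```python
from datasets import get_dataset_config_names

# List every task configuration published for this evaluation run.
configs = get_dataset_config_names("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3")
print(len(configs), configs[:5])
```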
## Latest results
These are the [latest results from run 2024-02-13T15:49:58.893408](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3/blob/main/results_2024-02-13T15-49-58.893408.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6493302568511367,
"acc_stderr": 0.032148382244220834,
"acc_norm": 0.648903728648512,
"acc_norm_stderr": 0.03281862542360137,
"mc1": 0.6389228886168911,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.7900424254093005,
"mc2_stderr": 0.013557770618845038
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.013385021637313572,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.7126070503883688,
"acc_stderr": 0.004516215206715359,
"acc_norm": 0.8888667596096396,
"acc_norm_stderr": 0.003136547276689888
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.02537952491077839,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.02537952491077839
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135367,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135367
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.016611393687268588,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.016611393687268588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079069,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6389228886168911,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.7900424254093005,
"mc2_stderr": 0.013557770618845038
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873518
},
"harness|gsm8k|5": {
"acc": 0.6785443517816527,
"acc_stderr": 0.012864471384836703
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3 | [
"region:us"
] | 2024-02-13T15:52:17+00:00 | {"pretty_name": "Evaluation run of bardsai/jaskier-7b-dpo-v3.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v3.3](https://huggingface.co/bardsai/jaskier-7b-dpo-v3.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T15:49:58.893408](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3/blob/main/results_2024-02-13T15-49-58.893408.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6493302568511367,\n \"acc_stderr\": 0.032148382244220834,\n \"acc_norm\": 0.648903728648512,\n \"acc_norm_stderr\": 0.03281862542360137,\n \"mc1\": 0.6389228886168911,\n \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.7900424254093005,\n \"mc2_stderr\": 0.013557770618845038\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.013385021637313572,\n \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059374\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7126070503883688,\n \"acc_stderr\": 0.004516215206715359,\n \"acc_norm\": 0.8888667596096396,\n \"acc_norm_stderr\": 0.003136547276689888\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.02537952491077839,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.02537952491077839\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135367,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135367\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8199233716475096,\n \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n \"acc_stderr\": 0.016611393687268588,\n \"acc_norm\": 0.4424581005586592,\n \"acc_norm_stderr\": 0.016611393687268588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079069,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079069\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6389228886168911,\n \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.7900424254093005,\n \"mc2_stderr\": 0.013557770618845038\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873518\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \"acc_stderr\": 0.012864471384836703\n }\n}\n```", "repo_url": 
"https://huggingface.co/bardsai/jaskier-7b-dpo-v3.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|arc:challenge|25_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|gsm8k|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hellaswag|10_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T15-49-58.893408.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T15-49-58.893408.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T15-49-58.893408.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T15-49-58.893408.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T15-49-58.893408.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T15-49-58.893408.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["**/details_harness|winogrande|5_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T15-49-58.893408.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T15_49_58.893408", "path": ["results_2024-02-13T15-49-58.893408.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T15-49-58.893408.parquet"]}]}]} | 2024-02-13T15:52:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v3.3
Dataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v3.3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
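A minimal reconstruction of the stripped code example, following the loading pattern used for the other cards in this document (the repository id is an assumption derived from the `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Assumed repository id, following the details_<org>__<model> naming
# convention used by the Open LLM Leaderboard detail datasets.
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3",
	"harness_winogrande_5",
	split="train")
```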
## Latest results
These are the latest results from run 2024-02-13T15:49:58.893408 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
b4a4c758a757884437bb563782ee0d0b2c28a639 |
# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-full
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-sft-full](https://huggingface.co/DatPySci/pythia-1b-sft-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-sft-full",
"harness_winogrande_5",
split="train")
```
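The aggregated metrics mentioned above live in the dedicated "results" configuration; a minimal sketch of loading them, assuming the standard `datasets` API and the "latest" split declared in this card's metadata:

```python
from datasets import load_dataset

# The "latest" split of the "results" configuration always points at the
# most recent evaluation run for this model.
results = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-sft-full",
	"results",
	split="latest")
print(results)  # inspect the schema before drilling into individual metrics
```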
## Latest results
These are the [latest results from run 2024-02-13T16:10:25.536341](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-sft-full/blob/main/results_2024-02-13T16-10-25.536341.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2437500378442946,
"acc_stderr": 0.030213863245287735,
"acc_norm": 0.24468974101675026,
"acc_norm_stderr": 0.03094305925119546,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023496,
"mc2": 0.37081334738032573,
"mc2_stderr": 0.014356461899633393
},
"harness|arc:challenge|25": {
"acc": 0.27303754266211605,
"acc_stderr": 0.013019332762635753,
"acc_norm": 0.295221843003413,
"acc_norm_stderr": 0.013329750293382316
},
"harness|hellaswag|10": {
"acc": 0.38697470623381797,
"acc_stderr": 0.004860623733461137,
"acc_norm": 0.48914558852818163,
"acc_norm_stderr": 0.004988605498273906
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.0301675334686327,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.0301675334686327
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708094,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708094
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.0339175032232166,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.0339175032232166
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162452,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162452
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.1793103448275862,
"acc_stderr": 0.031967664333731875,
"acc_norm": 0.1793103448275862,
"acc_norm_stderr": 0.031967664333731875
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604673,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604673
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.023904914311782644,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.023904914311782644
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.028247350122180253,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.028247350122180253
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.19393939393939394,
"acc_stderr": 0.0308741451365621,
"acc_norm": 0.19393939393939394,
"acc_norm_stderr": 0.0308741451365621
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.027479603010538808,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.027479603010538808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.02136202772522272,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.02136202772522272
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823019,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823019
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.02626502460827589,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.02626502460827589
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28623853211009176,
"acc_stderr": 0.019379436628919965,
"acc_norm": 0.28623853211009176,
"acc_norm_stderr": 0.019379436628919965
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.03256850570293648,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.03256850570293648
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615771,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094472,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.015866243073215054,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.015866243073215054
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.01440029642922563,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.01440029642922563
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341012,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341012
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290413,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290413
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23989569752281617,
"acc_stderr": 0.01090628261798164,
"acc_norm": 0.23989569752281617,
"acc_norm_stderr": 0.01090628261798164
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2977941176470588,
"acc_stderr": 0.027778298701545443,
"acc_norm": 0.2977941176470588,
"acc_norm_stderr": 0.027778298701545443
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.01766784161237899,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.01766784161237899
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.024789071332007643,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.024789071332007643
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197772,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197772
},
"harness|hendrycksTest-virology|5": {
"acc": 0.24096385542168675,
"acc_stderr": 0.0332939411907353,
"acc_norm": 0.24096385542168675,
"acc_norm_stderr": 0.0332939411907353
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023496,
"mc2": 0.37081334738032573,
"mc2_stderr": 0.014356461899633393
},
"harness|winogrande|5": {
"acc": 0.5367008681925809,
"acc_stderr": 0.014014578458843262
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.0038289829787357134
}
}
```
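As a sanity check, the MMLU macro-average can be recomputed directly from the JSON above; a hedged sketch assuming the excerpt has been saved locally as `results.json` (a hypothetical file name):

```python
import json

# Recompute the macro-average accuracy over the hendrycksTest (MMLU)
# subtasks listed above; results.json is a hypothetical local copy of
# the JSON excerpt shown in this section.
with open("results.json") as f:
    results = json.load(f)

mmlu = {name: scores for name, scores in results.items()
        if name.startswith("harness|hendrycksTest-")}
mean_acc = sum(scores["acc"] for scores in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```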
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DatPySci__pythia-1b-sft-full | [
"region:us"
] | 2024-02-13T16:12:08+00:00 | {"pretty_name": "Evaluation run of DatPySci/pythia-1b-sft-full", "dataset_summary": "Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-sft-full](https://huggingface.co/DatPySci/pythia-1b-sft-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DatPySci__pythia-1b-sft-full\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T16:10:25.536341](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-sft-full/blob/main/results_2024-02-13T16-10-25.536341.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2437500378442946,\n \"acc_stderr\": 0.030213863245287735,\n \"acc_norm\": 0.24468974101675026,\n \"acc_norm_stderr\": 0.03094305925119546,\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023496,\n \"mc2\": 0.37081334738032573,\n \"mc2_stderr\": 0.014356461899633393\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.27303754266211605,\n \"acc_stderr\": 0.013019332762635753,\n \"acc_norm\": 0.295221843003413,\n \"acc_norm_stderr\": 0.013329750293382316\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.38697470623381797,\n \"acc_stderr\": 0.004860623733461137,\n \"acc_norm\": 0.48914558852818163,\n \"acc_norm_stderr\": 0.004988605498273906\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.0301675334686327,\n \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.0301675334686327\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708094,\n \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708094\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 
0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.0339175032232166,\n \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.0339175032232166\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162452,\n \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162452\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.1793103448275862,\n \"acc_stderr\": 0.031967664333731875,\n \"acc_norm\": 0.1793103448275862,\n \"acc_norm_stderr\": 0.031967664333731875\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604673,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604673\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22903225806451613,\n \"acc_stderr\": 0.023904914311782644,\n \"acc_norm\": 0.22903225806451613,\n \"acc_norm_stderr\": 0.023904914311782644\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2019704433497537,\n \"acc_stderr\": 0.028247350122180253,\n \"acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.028247350122180253\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.19393939393939394,\n \"acc_stderr\": 0.0308741451365621,\n \"acc_norm\": 0.19393939393939394,\n \"acc_norm_stderr\": 0.0308741451365621\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538808,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752954\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.02136202772522272,\n \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.02136202772522272\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823019,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823019\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.02626502460827589,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.02626502460827589\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.28623853211009176,\n \"acc_stderr\": 0.019379436628919965,\n \"acc_norm\": 0.28623853211009176,\n \"acc_norm_stderr\": 0.019379436628919965\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.03256850570293648,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.03256850570293648\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695053,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695053\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.29596412556053814,\n \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615771,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.029343114798094472,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.029343114798094472\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.26947637292464877,\n \"acc_stderr\": 0.015866243073215054,\n \"acc_norm\": 0.26947637292464877,\n \"acc_norm_stderr\": 0.015866243073215054\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123567,\n \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123567\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.01440029642922563,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.01440029642922563\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341012,\n \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341012\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290413,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290413\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n \"acc_stderr\": 0.01090628261798164,\n \"acc_norm\": 0.23989569752281617,\n \"acc_norm_stderr\": 0.01090628261798164\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2977941176470588,\n \"acc_stderr\": 0.027778298701545443,\n \"acc_norm\": 0.2977941176470588,\n \"acc_norm_stderr\": 0.027778298701545443\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.024789071332007643,\n \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.024789071332007643\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.14,\n \"acc_stderr\": 0.03487350880197772,\n \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.03487350880197772\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.24096385542168675,\n \"acc_stderr\": 0.0332939411907353,\n \"acc_norm\": 0.24096385542168675,\n \"acc_norm_stderr\": 0.0332939411907353\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.030944459778533207,\n \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.030944459778533207\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023496,\n \"mc2\": 0.37081334738032573,\n \"mc2_stderr\": 0.014356461899633393\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5367008681925809,\n \"acc_stderr\": 0.014014578458843262\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n 
\"acc_stderr\": 0.0038289829787357134\n }\n}\n```", "repo_url": "https://huggingface.co/DatPySci/pythia-1b-sft-full", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|arc:challenge|25_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|gsm8k|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hellaswag|10_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T16-10-25.536341.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T16-10-25.536341.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T16-10-25.536341.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T16-10-25.536341.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T16-10-25.536341.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T16_10_25.536341", "path": ["**/details_harness|winogrande|5_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T16-10-25.536341.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_13T16_10_25.536341", "path": ["results_2024-02-13T16-10-25.536341.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T16-10-25.536341.parquet"]}]}]} | 2024-02-13T16:12:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-full
Dataset automatically created during the evaluation run of model DatPySci/pythia-1b-sft-full on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
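The snippet below is reproduced from the run metadata recorded for this card; `harness_winogrande_5` is one of the 63 configurations it lists:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-sft-full",
                    "harness_winogrande_5",
                    split="train")
```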
## Latest results
These are the latest results from run 2024-02-13T16:10:25.536341 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
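For reference, the aggregate scores from that run, excerpted verbatim from the metadata recorded above:

```python
{
    "all": {
        "acc": 0.2437500378442946,
        "acc_stderr": 0.030213863245287735,
        "acc_norm": 0.24468974101675026,
        "acc_norm_stderr": 0.03094305925119546,
        "mc1": 0.2252141982864137,
        "mc1_stderr": 0.014623240768023496,
        "mc2": 0.37081334738032573,
        "mc2_stderr": 0.014356461899633393
    }
}
```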
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-full\n\n\n\nDataset automatically created during the evaluation run of model DatPySci/pythia-1b-sft-full on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T16:10:25.536341(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-full\n\n\n\nDataset automatically created during the evaluation run of model DatPySci/pythia-1b-sft-full on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T16:10:25.536341(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-full\n\n\n\nDataset automatically created during the evaluation run of model DatPySci/pythia-1b-sft-full on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T16:10:25.536341(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
56443239519d5b7d8e413b3c09a7667d5334b2ae | This is the AYA-101 dataset converted to JSON and ready for DPO. | mitkox/aya_dataset | [
"region:us"
] | 2024-02-13T16:39:33+00:00 | {} | 2024-02-13T16:55:24+00:00 | [] | [] | TAGS
#region-us
| This is the AYA-101 dataset converted to JSON and ready for DPO. | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
2c2f3a5fe81dc640a959c2f04724a95c38ae13bd | # Persian Blog
# Dataset Summary
persian_blog is a collection of 400k blog posts. These posts have been gathered from more than 10 websites. This dataset can be used in different NLP tasks such as language modeling and text generation.
This effort is part of a bigger initiative to build several Persian-language datasets for different tasks, all sharing two important qualities: `free` and `easy-to-use`. Here is a quick HOW-TO for using this dataset with the datasets library: [Demo-datasets](https://saied71.github.io/saied-alimoradi-blog/posts/2021-9-4-demo-datasets.html)
# Description
As discussed before, this dataset contains 400k blog posts; each post carries its body in a text attribute. Here is a sample of the dataset:
```
text : چرا کودکان به روانشناس نیاز دارند؟ روانشناسی کودکانکودکان همچون غنچههای زیبا هستند که برای شکوفایی و به ثمر رسیدن نیاز به مراقبت و رسیدگی دارند . روانشناس کودک فردیست که از زمان بدو تولد کودک در مراحل مختلف زندگی کودک در کنار والدین وی میباشد و به چگونگی تربیت کودک کمک میکند تا به بهترین شکل رشد کند . چرا که روانشناس کودک با روحیات ، نیازها و مشکلات کودکان و همچنین چگونگی برقراری ارتباط بین کودک و والدین وی آشنایی کامل دارد .بسیاری از کودکان در سنین مختلف بخاطر شرایط زندگی ، دچار انواع ناسازگاریها و مشکلات در زندگی خود میشود از ناسازگاری کودکان میتوان به موارد زیر اشاره کرد : 1 . پرخاشگری 2 . بد دهنی 3 . اختلال در خوابیدن 4 . اختلال در غذا خوردن و کم اشتهایی 5 . حالت افسردگی و اضطراب 6 . ترس از محیط پیرامون 7 . عدم آمادگی برای ورود به جامعه 8 . وجود مشکل در محیط مدرسه 9 . عدم تمرکز 10 . جویدن ناخن ها 11 . انزوا و گوشه گیری 12 . عدم هم بازی شدن با هم سن و سال هاو .این گونه ناسازگاریها در زندگی آینده کودک نقش به سزایی دارد .روانشناس کودکیک روانشناس کودک خوب ، با دلسوزی و با تکیه بر تجربیات و تخصص خود میکوشد تا رفتارهای کودک را مورد ارزیابی و بررسی قرار دهد سپس سعی میکند تا رفتارهای بعدی کودک را پیش بینی کند و منشاء این مشکلات و سطح پیشرفت آن را بیابد. سپس او بهترین روشهای درمان برای بهبود اختلال کودک را مییابد و با کمک والدین این ناسازگاریها ، مشکلات و ناهنجاریها را حل کرده و نهایتا رابطهای دوستانه و صمیمانه بین کودک و والدین وی ایجاد مینماید تاآیندهای درخشان در انتظار کودک شما باشد .
```
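A minimal usage sketch with the datasets library — note that the split name is an assumption, and the sample above only shows a `text` field, so verify the schema before relying on other columns:

```python
from datasets import load_dataset

# Load the blog-post collection from the Hugging Face Hub.
# "train" is an assumed split name; adjust if the repository differs.
dataset = load_dataset("saied/Persian-blog", split="train")

# Each record carries the raw post body in its "text" field.
print(dataset[0]["text"][:200])
```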
# Citation
```
[email protected]
title={persian_blog},
author={Saied Alimoradi},
year={2021}
}
```
| saied/Persian-blog | [
"task_categories:text-generation",
"task_ids:language-modeling",
"source_datasets:original",
"language:fa",
"region:us"
] | 2024-02-13T16:48:07+00:00 | {"language": ["fa"], "source_datasets": ["original"], "task_categories": ["text-generation"], "task_ids": ["language-modeling"], "pretty_name": "persian_blog"} | 2024-02-14T11:09:30+00:00 | [] | [
"fa"
] | TAGS
#task_categories-text-generation #task_ids-language-modeling #source_datasets-original #language-Persian #region-us
| # Persian Blog
# Dataset Summary
persian_news_dataset is a collection of 400k blog posts. these posts have been gathered from more than 10 websites. This dataset can be used in different NLP tasks like language modeling and text generation tasks.
This effort is part of a bigger perspective to have several datasets in Persian language for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets
# Description
As discussed before, this dataset contains 5M news articles. Each article has these three attributes: text, title, category. Here is a sample of dataset:
| [
"# Persian Blog",
"# Dataset Summary\n\npersian_news_dataset is a collection of 400k blog posts. these posts have been gathered from more than 10 websites. This dataset can be used in different NLP tasks like language modeling and text generation tasks.\n\nThis effort is part of a bigger perspective to have several datasets in Persian language for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets",
"# Description\n\nAs discussed before, this dataset contains 5M news articles. Each article has these three attributes: text, title, category. Here is a sample of dataset:"
] | [
"TAGS\n#task_categories-text-generation #task_ids-language-modeling #source_datasets-original #language-Persian #region-us \n",
"# Persian Blog",
"# Dataset Summary\n\npersian_news_dataset is a collection of 400k blog posts. these posts have been gathered from more than 10 websites. This dataset can be used in different NLP tasks like language modeling and text generation tasks.\n\nThis effort is part of a bigger perspective to have several datasets in Persian language for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets",
"# Description\n\nAs discussed before, this dataset contains 5M news articles. Each article has these three attributes: text, title, category. Here is a sample of dataset:"
] | [
40,
4,
123,
39
] | [
"passage: TAGS\n#task_categories-text-generation #task_ids-language-modeling #source_datasets-original #language-Persian #region-us \n# Persian Blog# Dataset Summary\n\npersian_news_dataset is a collection of 400k blog posts. these posts have been gathered from more than 10 websites. This dataset can be used in different NLP tasks like language modeling and text generation tasks.\n\nThis effort is part of a bigger perspective to have several datasets in Persian language for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets# Description\n\nAs discussed before, this dataset contains 5M news articles. Each article has these three attributes: text, title, category. Here is a sample of dataset:"
] |
10ed678823888e34ce7e1453815e3128d6c94a4f | # Dataset Card for Transcript to SOAP
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
- **Curated by:** [RevMaxx Team](https://revmaxx.co/)
- **Language(s) (NLP):** English
#### Personal and Sensitive Information
The dataset contains sensitive information from patient-doctor conversations.
## Dataset Card Authors
- Team RevMaxx | revmaxx/transcript-soap | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"license:afl-3.0",
"medical",
"region:us"
] | 2024-02-13T17:03:52+00:00 | {"license": "afl-3.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "pretty_name": "soap-generation", "tags": ["medical"]} | 2024-02-17T02:04:10+00:00 | [] | [] | TAGS
#task_categories-text-generation #size_categories-1K<n<10K #license-afl-3.0 #medical #region-us
| # Dataset Card for Transcript to SOAP
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by: RevMaxx Team
- Language(s) (NLP): English
#### Personal and Sensitive Information
The dataset contains sensitive information of patient-doctor conversation.
## Dataset Card Authors
- Team RevMaxx | [
"# Dataset Card for Transcript to SOAP\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n- Curated by: RevMaxx Team\n- Language(s) (NLP): English",
"#### Personal and Sensitive Information\n\nThe dataset contains sensitive information of patient-doctor conversation.",
"## Dataset Card Authors\n\n- Team RevMaxx"
] | [
"TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #license-afl-3.0 #medical #region-us \n",
"# Dataset Card for Transcript to SOAP\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n- Curated by: RevMaxx Team\n- Language(s) (NLP): English",
"#### Personal and Sensitive Information\n\nThe dataset contains sensitive information of patient-doctor conversation.",
"## Dataset Card Authors\n\n- Team RevMaxx"
] | [
40,
36,
4,
24,
22,
11
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #license-afl-3.0 #medical #region-us \n# Dataset Card for Transcript to SOAP\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n- Curated by: RevMaxx Team\n- Language(s) (NLP): English#### Personal and Sensitive Information\n\nThe dataset contains sensitive information of patient-doctor conversation.## Dataset Card Authors\n\n- Team RevMaxx"
] |
50975962715d17478ce00a68d643dc9cb36047f1 |
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 19,793 | 126,842 | 8,710 |
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
```
| SauravMaheshkar/pareto-cora | [
"task_categories:graph-ml",
"size_categories:1K<n<10K",
"license:cc",
"arxiv:2210.02016",
"region:us"
] | 2024-02-13T17:05:04+00:00 | {"license": "cc", "size_categories": ["1K<n<10K"], "task_categories": ["graph-ml"]} | 2024-02-14T11:05:59+00:00 | [
"2210.02016"
] | [] | TAGS
#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us
| Dataset Information
-------------------
Pre-processed as per the official codebase of the paper URL
| [] | [
"TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] | [
41
] | [
"passage: TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] |
7ecd35ff8f65a8e599dedf4d0c8f3e09f750197a | # Dataset Card for: Arabic Aya (2A)
## Dataset Description
**Arabic Aya (2A): A Curated Subset of the Aya Collection for Arabic Language Processing**
### Overview
`Arabic Aya` is a meticulously curated dataset derived from the comprehensive Aya collection by [CohereForAI](https://huggingface.co/CohereForAI), specifically focusing on Arabic text data. This dataset aggregates content from the [CohereForAI/aya_collection](https://huggingface.co/datasets/CohereForAI/aya_collection), [CohereForAI/aya_dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset), and [CohereForAI/aya_evaluation_suite](https://huggingface.co/datasets/CohereForAI/aya_evaluation_suite), filtering out all but the Arabic content, including both Modern Standard Arabic (MSA) and various regional dialects.
### Purpose
The aim of 'Arabic Aya' is to provide researchers, technologists, and linguists with a ready-to-use Arabic text resource, significantly reducing the time and effort required for data preprocessing in NLP and AI projects focused on the Arabic language.
- Use the Aya datasets out of the box for your Arabic applications and research 😀
### Dataset Sources & Infos
- **Data Origin**: Derived from 69 subsets of the original Aya datasets: [CohereForAI/aya_collection](https://huggingface.co/datasets/CohereForAI/aya_collection), [CohereForAI/aya_dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset), and [CohereForAI/aya_evaluation_suite](https://huggingface.co/datasets/CohereForAI/aya_evaluation_suite).
- **Languages**: Modern Standard Arabic (MSA) and a variety of Arabic dialects ('arb', 'arz', 'ary', 'ars', 'knc', 'acm', 'apc', 'aeb', 'ajp', 'acq')
- **Applications**: Ideal for tasks such as language modeling, text classification, sentiment analysis, dialect identification, and machine translation.
- **Paper:** [2402.06619](https://huggingface.co/papers/2402.06619)
- **Maintainer:** [Elfilali Ali](https://huggingface.co/Ali-C137)
- **License:** Apache-2.0
### Usage
This dataset serves as a foundational tool for those embarking on Arabic language projects, from academic research to commercial applications. By providing a pre-filtered source of Arabic text, 'Arabic Aya' enables users to dive straight into model training, analysis, and application development without the preliminary hassle of data cleaning and language filtering.
#### Use with HF Datasets library
To load this dataset with Datasets, you'll need to install Datasets as `pip install datasets --upgrade` and then use code similar to the following:
```python
from datasets import load_dataset
dataset = load_dataset("2A2I/Arabic_Aya", "CohereForAI-aya_collection-templated_mintaka")
```
In the above code snippet, "CohereForAI-aya_collection-templated_mintaka" refers to the Arabic version (100k rows) of the original "templated_mintaka" subset (780k rows) of the aya_collection. You can load other subsets by specifying their names when loading the dataset.
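
If you prefer not to hard-code subset names, the `datasets` library can list the available configurations; a short sketch (the printed names are whatever the Hub reports, not assumptions):

```python
from datasets import get_dataset_config_names, load_dataset

# Discover the available Arabic subsets instead of hard-coding their names.
configs = get_dataset_config_names("2A2I/Arabic_Aya")
print(len(configs), configs[:3])

# Any returned name can be passed as the second argument to load_dataset.
subset = load_dataset("2A2I/Arabic_Aya", configs[0])
```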
### Access and Contribution
Available on the Hugging Face Hub under [2A2I/Arabic_Aya](https://huggingface.co/datasets/2A2I/Arabic_Aya), 'Arabic Aya' invites contributions from the community. Users are encouraged to offer feedback and suggest improvements.
### Support and Collaboration
We are committed to fostering an inclusive and supportive environment around Arabic AI and NLP research. For support, collaboration, or queries regarding the dataset, please reach out through the Hugging Face Hub's discussion section or at [2A2I Contact Email]([email protected]).
# Original Dataset Card of Aya by CohereForAI

# Dataset Summary
The Aya Collection is a massive multilingual collection consisting of 513 million instances of prompts and completions covering a wide range of tasks.
This collection incorporates instruction-style templates from fluent speakers and applies them to a curated list of datasets, as well as translations of instruction-style datasets into 101 languages. Aya Dataset, a human-curated multilingual instruction and response dataset, is also part of this collection. See our paper for more details regarding the collection.
- **Curated by:** Contributors of [Aya Open Science Initiative](https://cohere.com/research/aya)
- **Language(s):** 115 languages
- **License:** [Apache 2.0](https://opensource.org/license/apache-2-0)
- **Aya Datasets Family:**
| Name | Explanation |
|------|--------------|
| [aya_dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset) | Human-annotated multilingual instruction finetuning dataset, comprising over 204K instances across 65 languages. |
| [aya_collection](https://huggingface.co/datasets/CohereForAI/aya_collection) | Created by applying instruction-style templates from fluent speakers to 44 datasets, including translations of 19 instruction-style datasets into 101 languages.|
| [aya_evaluation_suite](https://huggingface.co/datasets/CohereForAI/aya_evaluation_suite) | A diverse evaluation set for multilingual open-ended generation, featuring 250 culturally grounded prompts in 7 languages, 200 translated prompts in 24 languages, and human-edited versions selected for cross-cultural relevance from English Dolly in 6 languages.|
# Dataset
The `Aya Collection` is a comprehensive, large corpus of datasets that can be used by researchers around the world to train multilingual models. Our goal is only to include datasets with permissive licensing for manipulation and redistribution.
The `Aya Collection` consists of three different sources of data:
1. Templated data: We collaborated with fluent speakers to create templates that allowed for the automatic expansion of existing datasets into various languages.
2. Translated data: We translated a hand-selected subset of 19 datasets into 101 languages (114 dialects) using the NLLB 3.3B parameter machine translation model.
3. Aya Dataset: We release the [Aya Dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset) as a subset of the overall collection. This is the only dataset in the collection that is human-annotated in its entirety.
## Load with Datasets
To load this dataset with Datasets, you'll need to install Datasets as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
dataset = load_dataset("CohereForAI/aya_collection", "templated_mintaka")
```
In the above code snippet, "templated_mintaka" refers to a subset of the aya_collection. You can load other subsets by specifying their names when loading the dataset.
## Data Instances
An example of a `train` instance looks as follows:
```json
{'id': 246001,
'inputs': 'The following query in English is taken from the geography category. What could be the answer to the question?\nWhat is the seventh tallest mountain in North America?',
'targets': 'The answer is Mount Lucania.',
'dataset_name': 'Mintaka-inst',
'sub_dataset_name': '-',
'task_type': 'question-answering',
'template_id': 3,
'language': 'eng',
'split': 'train',
'script': 'Latn'
}
```
## Data Fields
The data fields are the same among all splits:
- `id:` Unique id of the data point
- `inputs:` Prompt or input to the language model.
- `targets:` Completion or output of the language model.
- `dataset_name:` The name of the source dataset that the data point was taken from
- `sub_dataset_name:` If the source is a collection, this field indicates which part of that collection the data point was taken from. If it is not a collection, this field is left blank.
- `task_type:` The task type that this conversation belongs to.
- `template_id`: The id of the template applied to this data point.
- `language:` The ISO code of the dialect of the conversation.
- `script:` The script of the language.
- `split:` Indicates whether the data point is part of the `train` or the `test` split.
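
As a small illustration of how these fields can be used, the sketch below filters one subset on `task_type`; the subset name and field value follow the loading example and the instance shown above:

```python
from datasets import load_dataset

# Keep only the question-answering rows of the "templated_mintaka" subset,
# using the `task_type` field documented above.
ds = load_dataset("CohereForAI/aya_collection", "templated_mintaka", split="train")
qa_only = ds.filter(lambda ex: ex["task_type"] == "question-answering")
print(len(qa_only))
```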
### Statistics
The total number of data points, including the Aya Dataset, is 513,758,189. To view the breakdown of dialect codes and the respective templated and translated data point counts in the Aya Collection, refer to the toggled table below.
<details>
<summary> <b> Breakdown of Aya Collection data point counts grouped by dialects </b> </summary>
|dialect code|language|translated data point count|templated data point count|total count |
|------------|--------|---------------------------|--------------------------|---------------|
|ace |Achinese|8240684 |2000 |8242684 |
|acm |Arabic |4120342 |0 |4120342 |
|acq |Arabic |4120342 |0 |4120342 |
|aeb |Arabic |4120342 |0 |4120342 |
|afr |Afrikaans|4120342 |6108 |4126450 |
|ajp |Arabic |4120342 |0 |4120342 |
|als |Albanian|4120342 |0 |4120342 |
|amh |Amharic |4120342 |25327 |4145669 |
|apc |Arabic |4120342 |0 |4120342 |
|arb |Arabic |6424999 |216430 |6641429 |
|ars |Arabic |4120342 |0 |4120342 |
|ary |Arabic |4120342 |18076 |4138418 |
|arz |Arabic |4120342 |0 |4120342 |
|azb |Azerbaijani|4120342 |0 |4120342 |
|azj |Azerbaijani|4120342 |0 |4120342 |
|bel |Belarusian|4120342 |21273 |4141615 |
|ben |Bengali |4120342 |30661 |4151003 |
|bjn |Banjar |8240684 |2000 |8242684 |
|bul |Bulgarian|4120342 |37722 |4158064 |
|cat |Catalan |4120342 |66900 |4187242 |
|ceb |Cebuano |4120342 |0 |4120342 |
|ces |Czech |4120342 |179604 |4299946 |
|ckb |Kurdish |4120342 |0 |4120342 |
|cym |Welsh |4120342 |0 |4120342 |
|dan |Danish |4120342 |36310 |4156652 |
|deu |German |4120342 |1326722 |5447064 |
|ell |Greek |4120342 |40291 |4160633 |
|eng |English |9771427 |8066678 |17838105 |
|epo |Esperanto|4120342 |0 |4120342 |
|est |Estonian|4120342 |0 |4120342 |
|eus |Basque |4120342 |0 |4120342 |
|fin |Finnish |4120342 |457895 |4578237 |
|fra |French |4120342 |835520 |4955862 |
|gla |Scottish Gaelic|4120342 |0 |4120342 |
|gle |Irish |4120342 |0 |4120342 |
|glg |Galician|4120342 |0 |4120342 |
|guj |Gujarati|4120342 |2157 |4122499 |
|hat |Haitian Creole|4120342 |0 |4120342 |
|hau |Hausa |4120342 |51396 |4171738 |
|heb |Hebrew |4120342 |103466 |4223808 |
|hin |Hindi |4120342 |260387 |4380729 |
|hun |Hungarian|4120342 |82039 |4202381 |
|hye |Armenian|4120342 |7080 |4127422 |
|ibo |Igbo |4120342 |36312 |4156654 |
|ind |Indonesian|4120342 |45709 |4166051 |
|isl |Icelandic|4120342 |0 |4120342 |
|ita |Italian |4120342 |405682 |4526024 |
|jav |Javanese|4120342 |829 |4121171 |
|jpn |Japanese|4120342 |2693177 |6813519 |
|kan |Kannada |4120342 |1156 |4121498 |
|kas |Kashmiri|4120342 |0 |4120342 |
|kat |Georgian|4120342 |0 |4120342 |
|kaz |Kazakh |4120342 |0 |4120342 |
|khk |Mongolian|4120342 |0 |4120342 |
|khm |Khmer |4120342 |0 |4120342 |
|kir |Kyrgyz |4120342 |0 |4120342 |
|kmr |Kurdish |4120342 |0 |4120342 |
|knc |Kanuri |8240684 |0 |8240684 |
|kor |Korean |4120342 |41011 |4161353 |
|lao |Lao |4120342 |0 |4120342 |
|lit |Lithuanian|4120342 |0 |4120342 |
|ltz |Luxembourgish|4120342 |0 |4120342 |
|lvs |Latvian |4120342 |0 |4120342 |
|mal |Malayalam|4120342 |4347 |4124689 |
|mar |Marathi |4120342 |3678 |4124020 |
|min |Minangkabau|6753788 |2000 |6755788 |
|mkd |Macedonian|4120342 |0 |4120342 |
|mlt |Maltese |4120342 |0 |4120342 |
|mni |Manipuri|4120342 |0 |4120342 |
|mri |Maori |4120342 |0 |4120342 |
|mya |Burmese |4120342 |0 |4120342 |
|nld |Dutch |4120342 |220181 |4340523 |
|nno |Norwegian|4120342 |0 |4120342 |
|nob |Norwegian|4120342 |0 |4120342 |
|npi |Nepali |4120342 |0 |4120342 |
|nso |Northern Sotho|4120342 |0 |4120342 |
|pbt |Pashto |4120342 |0 |4120342 |
|pes |Persian |4120342 |245520 |4365862 |
|plt |Malagasy|4120342 |0 |4120342 |
|pol |Polish |4120342 |332503 |4452845 |
|por |Portuguese|4120342 |287432 |4407774 |
|ron |Romanian|4120342 |36359 |4156701 |
|rus |Russian |4120342 |545920 |4666262 |
|sin |Sinhala |4120342 |195 |4120537 |
|slk |Slovak |4120342 |27845 |4148187 |
|slv |Slovenian|4120342 |25731 |4146073 |
|smo |Samoan |4120342 |0 |4120342 |
|sna |Shona |4120342 |3684 |4124026 |
|snd |Sindhi |4120342 |0 |4120342 |
|som |Somali |4120342 |2926 |4123268 |
|sot |Southern Sotho|4120342 |0 |4120342 |
|spa |Spanish |4120342 |379194 |4499536 |
|srp |Serbian |4120342 |77124 |4197466 |
|sun |Sundanese|4120342 |2208 |4122550 |
|swe |Swedish |4120342 |76486 |4196828 |
|swh |Swahili |4120342 |12726 |4133068 |
|tam |Tamil |4120342 |11462 |4131804 |
|taq |Tamasheq|4120342 |0 |4120342 |
|tel |Telugu |4120342 |477821 |4598163 |
|tgk |Tajik |4120342 |0 |4120342 |
|tha |Thai |4120342 |2125180 |6245522 |
|tur |Turkish |4120342 |59932 |4180274 |
|ukr |Ukrainian|4120342 |189384 |4309726 |
|urd |Urdu |4120342 |337739 |4458081 |
|uzn |Uzbek |4120342 |0 |4120342 |
|vie |Vietnamese|4120342 |42232 |4162574 |
|xho |Xhosa |4120342 |2952 |4123294 |
|ydd |Yiddish |4120342 |0 |4120342 |
|yor |Yoruba |4120342 |4907 |4125249 |
|yue |Chinese |4120342 |0 |4120342 |
|zho-Hans |Chinese |4120342 |54528 |4174870 |
|zho-Hant |Chinese |4120342 |0 |4120342 |
|zsm |Malay |4120342 |13950 |4134292 |
|zul |Zulu |4120342 |786 |4121128 |
|arq |Arabic |0 |6046 |6046 |
|ban |Balinese|0 |2000 |2000 |
|bbc |Toba Batak|0 |2000 |2000 |
|bem |Bemba |0 |776 |776 |
|fil |Filipino|0 |220 |220 |
|fon |Fon |0 |845 |845 |
|hrv |Croatian|0 |9007 |9007 |
|kin |Kinyarwanda|0 |11165 |11165 |
|lij |Ligurian|0 |6409 |6409 |
|mad |Madurese|0 |2000 |2000 |
|nij |Ngaju |0 |2000 |2000 |
|nor |Norwegian|0 |72352 |72352 |
|pan |Punjabi |0 |2156 |2156 |
|twi |Twi |0 |10840 |10840 |
|wol |Wolof |0 |785 |785 |
|zho |Chinese |0 |74972 |74972 |
PS: Templated data also includes Mozambican Portuguese, which doesn't have its own ISO language code.
</details>
<br>
# Motivations & Intentions
- **Curation Rationale:** Automatic augmentation of existing datasets serves to enhance the available linguistic resources for multiple languages. The list of languages was initially established from mT5 and aligned with the annotators’ language list and NLLB translation model. The datasets were translated directly from English for all languages.
# Additional Information
## Provenance
- **Methods Used:** A combination of crowd-sourced templating and automatic translation was employed to source this dataset.
- **Methodology Details:**
- *Source:* Existing NLP datasets
- *Dates of Collection:* May 2023 - Dec 2023
## Dataset Version and Maintenance
- **Maintenance Status:** Actively Maintained
- **Version Details:**
- *Current version:* 1.0
- *Last Update:* 02/2024
- *First Release:* 02/2024
## Authorship
- **Publishing Organization:** [Cohere For AI](https://cohere.com/research)
- **Industry Type:** Not-for-profit - Tech
- **Contact Details:** https://cohere.com/research/aya
## Licensing Information
This dataset can be used for any purpose, whether academic or commercial, under the terms of the [Apache 2.0](https://opensource.org/license/apache-2-0) License.
## Citation Information
```bibtex
@misc{singh2024aya,
title={Aya Dataset: An Open-Access Collection for Multilingual Instruction Tuning},
author={Shivalika Singh and Freddie Vargus and Daniel Dsouza and Börje F. Karlsson and Abinaya Mahendiran and Wei-Yin Ko and Herumb Shandilya and Jay Patel and Deividas Mataciunas and Laura OMahony and Mike Zhang and Ramith Hettiarachchi and Joseph Wilson and Marina Machado and Luisa Souza Moura and Dominik Krzemiński and Hakimeh Fadaei and Irem Ergün and Ifeoma Okoh and Aisha Alaagib and Oshan Mudannayake and Zaid Alyafeai and Vu Minh Chien and Sebastian Ruder and Surya Guthikonda and Emad A. Alghamdi and Sebastian Gehrmann and Niklas Muennighoff and Max Bartolo and Julia Kreutzer and Ahmet Üstün and Marzieh Fadaee and Sara Hooker},
year={2024},
eprint={2402.06619},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 2A2I/Arabic_Aya | [
"task_categories:text-classification",
"task_categories:translation",
"task_categories:summarization",
"size_categories:1M<n<10M",
"language:ar",
"license:apache-2.0",
"arxiv:2402.06619",
"region:us"
] | 2024-02-13T17:16:49+00:00 | {"language": ["ar"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-classification", "translation", "summarization"], "pretty_name": "2A", "dataset_info": [{"config_name": "CohereForAI-aya_collection-aya_dataset", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "string"}, {"name": "language_code", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 7555482, "num_examples": 13960}], "download_size": 3687445, "dataset_size": 7555482}, {"config_name": "CohereForAI-aya_collection-aya_human_annotated", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 222650, "num_examples": 250}], "download_size": 120393, "dataset_size": 222650}, {"config_name": "CohereForAI-aya_collection-templated_afrisenti", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 5070578, "num_examples": 14468}, {"name": "test", "num_bytes": 2674428, "num_examples": 7838}, {"name": "validation", "num_bytes": 643036, "num_examples": 1816}], "download_size": 2330165, "dataset_size": 8388042}, {"config_name": "CohereForAI-aya_collection-templated_mintaka", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 20413129, "num_examples": 70000}, {"name": "test", "num_bytes": 5799667, "num_examples": 20000}, {"name": "validation", "num_bytes": 2976183, "num_examples": 10000}], "download_size": 6746433, "dataset_size": 29188979}, {"config_name": "CohereForAI-aya_collection-templated_ntx_llm", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 199809, "num_examples": 111}], 
"download_size": 34306, "dataset_size": 199809}, {"config_name": "CohereForAI-aya_collection-templated_xcsqa", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "validation", "num_bytes": 393580, "num_examples": 1000}], "download_size": 137233, "dataset_size": 393580}, {"config_name": "CohereForAI-aya_collection-templated_xlel_wd", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 97691354, "num_examples": 90760}, {"name": "test", "num_bytes": 15499274, "num_examples": 14791}, {"name": "validation", "num_bytes": 10752041, "num_examples": 9768}], "download_size": 57959575, "dataset_size": 123942669}, {"config_name": "CohereForAI-aya_collection-translated_adversarial_qa", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 147727007, "num_examples": 100000}, {"name": "test", "num_bytes": 16108000, "num_examples": 10000}, {"name": "validation", "num_bytes": 14862183, "num_examples": 10000}], "download_size": 52642775, "dataset_size": 178697190}, {"config_name": "CohereForAI-aya_collection-translated_dolly", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "gcp_source", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "alphabet", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 213140804, "num_examples": 148080}], "download_size": 96189154, "dataset_size": 213140804}, {"config_name": "CohereForAI-aya_collection-translated_flan_coqa", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", 
"dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 245744048, "num_examples": 64090}], "download_size": 124335769, "dataset_size": 245744048}, {"config_name": "CohereForAI-aya_collection-translated_flan_gem_wiki", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 961863533.277311, "num_examples": 271470}], "download_size": 485152798, "dataset_size": 961863533.277311}, {"config_name": "CohereForAI-aya_collection-translated_flan_qa", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2989244, "num_examples": 5400}], "download_size": 1292664, "dataset_size": 2989244}, {"config_name": "CohereForAI-aya_collection-translated_joke_explaination", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 8219049, "num_examples": 7540}], "download_size": 3600136, "dataset_size": 8219049}, {"config_name": "CohereForAI-aya_collection-translated_mintaka", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 40908047, "num_examples": 140000}, {"name": "test", "num_bytes": 11646781, "num_examples": 40000}, {"name": "validation", "num_bytes": 5951801, "num_examples": 20000}], "download_size": 12723211, "dataset_size": 58506629}, {"config_name": "CohereForAI-aya_collection-translated_mlqa", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", 
"num_bytes": 331062576, "num_examples": 231800}, {"name": "validation", "num_bytes": 31900260, "num_examples": 22960}], "download_size": 146571384, "dataset_size": 362962836}, {"config_name": "CohereForAI-aya_collection-translated_nqopen", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 397677612, "num_examples": 1758500}, {"name": "validation", "num_bytes": 16780970, "num_examples": 72200}], "download_size": 136208663, "dataset_size": 414458582}, {"config_name": "CohereForAI-aya_collection-translated_paws", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 303643575, "num_examples": 494010}, {"name": "test", "num_bytes": 49242541, "num_examples": 80000}, {"name": "validation", "num_bytes": 49475307, "num_examples": 80000}], "download_size": 66436419, "dataset_size": 402361423}, {"config_name": "CohereForAI-aya_collection-translated_piqa", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 113290227, "num_examples": 161130}, {"name": "validation", "num_bytes": 12924744, "num_examples": 18380}], "download_size": 45954644, "dataset_size": 126214971}, {"config_name": "CohereForAI-aya_collection-translated_wikiqa", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "sub_dataset_name", "dtype": "string"}, {"name": "task_type", "dtype": "string"}, {"name": "template_id", "dtype": "int64"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 5014300, "num_examples": 10400}, {"name": "test", "num_bytes": 1378807, "num_examples": 2930}, {"name": "validation", "num_bytes": 685770, "num_examples": 1400}], "download_size": 2872586, "dataset_size": 7078877}, {"config_name": "CohereForAI-aya_dataset", "features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "language_code", "dtype": "string"}, {"name": "annotation_type", "dtype": "string"}, {"name": "user_id", "dtype": 
"string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 8314232, "num_examples": 13960}, {"name": "test", "num_bytes": 246400, "num_examples": 250}], "download_size": 3778631, "dataset_size": 8560632}, {"config_name": "CohereForAI-aya_evaluation_suite-aya_human_annotated", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 222650, "num_examples": 250}], "download_size": 120393, "dataset_size": 222650}, {"config_name": "CohereForAI-aya_evaluation_suite-dolly_human_edited", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "source_id", "dtype": "int64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 188495, "num_examples": 200}], "download_size": 100291, "dataset_size": 188495}, {"config_name": "CohereForAI-aya_evaluation_suite-dolly_machine_translated", "features": [{"name": "id", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "script", "dtype": "string"}, {"name": "source_id", "dtype": "int64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 3491803, "num_examples": 2000}], "download_size": 1762303, "dataset_size": 3491803}], "configs": [{"config_name": "CohereForAI-aya_collection-aya_dataset", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-aya_dataset/train-*"}]}, {"config_name": "CohereForAI-aya_collection-aya_human_annotated", "data_files": [{"split": "test", "path": "CohereForAI-aya_collection-aya_human_annotated/test-*"}]}, {"config_name": "CohereForAI-aya_collection-templated_afrisenti", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-templated_afrisenti/train-*"}, {"split": "test", "path": "CohereForAI-aya_collection-templated_afrisenti/test-*"}, {"split": "validation", "path": "CohereForAI-aya_collection-templated_afrisenti/validation-*"}]}, {"config_name": "CohereForAI-aya_collection-templated_mintaka", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-templated_mintaka/train-*"}, {"split": "test", "path": "CohereForAI-aya_collection-templated_mintaka/test-*"}, {"split": "validation", "path": "CohereForAI-aya_collection-templated_mintaka/validation-*"}]}, {"config_name": "CohereForAI-aya_collection-templated_ntx_llm", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-templated_ntx_llm/train-*"}]}, {"config_name": "CohereForAI-aya_collection-templated_xcsqa", "data_files": [{"split": "validation", "path": "CohereForAI-aya_collection-templated_xcsqa/validation-*"}]}, {"config_name": "CohereForAI-aya_collection-templated_xlel_wd", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-templated_xlel_wd/train-*"}, {"split": "test", "path": "CohereForAI-aya_collection-templated_xlel_wd/test-*"}, {"split": "validation", "path": "CohereForAI-aya_collection-templated_xlel_wd/validation-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_adversarial_qa", "data_files": [{"split": "train", "path": 
"CohereForAI-aya_collection-translated_adversarial_qa/train-*"}, {"split": "test", "path": "CohereForAI-aya_collection-translated_adversarial_qa/test-*"}, {"split": "validation", "path": "CohereForAI-aya_collection-translated_adversarial_qa/validation-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_dolly", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-translated_dolly/train-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_flan_coqa", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-translated_flan_coqa/train-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_flan_gem_wiki", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-translated_flan_gem_wiki/train-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_flan_qa", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-translated_flan_qa/train-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_joke_explaination", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-translated_joke_explaination/train-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_mintaka", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-translated_mintaka/train-*"}, {"split": "test", "path": "CohereForAI-aya_collection-translated_mintaka/test-*"}, {"split": "validation", "path": "CohereForAI-aya_collection-translated_mintaka/validation-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_mlqa", "data_files": [{"split": "test", "path": "CohereForAI-aya_collection-translated_mlqa/test-*"}, {"split": "validation", "path": "CohereForAI-aya_collection-translated_mlqa/validation-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_nqopen", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-translated_nqopen/train-*"}, {"split": "validation", "path": "CohereForAI-aya_collection-translated_nqopen/validation-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_paws", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-translated_paws/train-*"}, {"split": "test", "path": "CohereForAI-aya_collection-translated_paws/test-*"}, {"split": "validation", "path": "CohereForAI-aya_collection-translated_paws/validation-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_piqa", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-translated_piqa/train-*"}, {"split": "validation", "path": "CohereForAI-aya_collection-translated_piqa/validation-*"}]}, {"config_name": "CohereForAI-aya_collection-translated_wikiqa", "data_files": [{"split": "train", "path": "CohereForAI-aya_collection-translated_wikiqa/train-*"}, {"split": "test", "path": "CohereForAI-aya_collection-translated_wikiqa/test-*"}, {"split": "validation", "path": "CohereForAI-aya_collection-translated_wikiqa/validation-*"}]}, {"config_name": "CohereForAI-aya_dataset", "data_files": [{"split": "train", "path": "CohereForAI-aya_dataset/train-*"}, {"split": "test", "path": "CohereForAI-aya_dataset/test-*"}]}, {"config_name": "CohereForAI-aya_evaluation_suite-aya_human_annotated", "data_files": [{"split": "test", "path": "CohereForAI-aya_evaluation_suite-aya_human_annotated/test-*"}]}, {"config_name": "CohereForAI-aya_evaluation_suite-dolly_human_edited", "data_files": [{"split": "test", "path": "CohereForAI-aya_evaluation_suite-dolly_human_edited/test-*"}]}, {"config_name": "CohereForAI-aya_evaluation_suite-dolly_machine_translated", "data_files": [{"split": 
"test", "path": "CohereForAI-aya_evaluation_suite-dolly_machine_translated/test-*"}]}]} | 2024-02-14T23:12:36+00:00 | [
"2402.06619"
] | [
"ar"
] | TAGS
#task_categories-text-classification #task_categories-translation #task_categories-summarization #size_categories-1M<n<10M #language-Arabic #license-apache-2.0 #arxiv-2402.06619 #region-us
| Dataset Card for: Arabic Aya (2A)
==================================
Dataset Description
-------------------
Arabic Aya (2A): A Curated Subset of the Aya Collection for Arabic Language Processing
### Overview
'Arabic Aya' is a meticulously curated dataset derived from the comprehensive Aya collection by CohereForAI, specifically focusing on Arabic text data. This dataset aggregates content from the CohereForAI/aya\_collection, CohereForAI/aya\_dataset, and CohereForAI/aya\_evaluation\_suite, filtering out all but the Arabic content, including both Modern Standard Arabic (MSA) and various regional dialects.
### Purpose
The aim of 'Arabic Aya' is to provide researchers, technologists, and linguists with a ready-to-use Arabic text resource, significantly reducing the time and effort required for data preprocessing in NLP and AI projects focused on the Arabic language.
* Use the Aya datasets out of the box for your Arabic applications and research
### Dataset Sources & Infos
* Data Origin: Derived from 69 subsets of the original Aya datasets: CohereForAI/aya\_collection, CohereForAI/aya\_dataset, and CohereForAI/aya\_evaluation\_suite.
* Languages: Modern Standard Arabic (MSA) and a variety of Arabic dialects ('arb', 'arz', 'ary', 'ars', 'knc', 'acm', 'apc', 'aeb', 'ajp', 'acq')
* Applications: Ideal for tasks such as language modeling, text classification, sentiment analysis, dialect identification, and machine translation.
* Paper: 2402.06619
* Maintainer: Elfilali Ali
* License: Apache-2.0
### Usage
This dataset serves as a foundational tool for those embarking on Arabic language projects, from academic research to commercial applications. By providing a pre-filtered source of Arabic text, 'Arabic Aya' enables users to dive straight into model training, analysis, and application development without the preliminary hassle of data cleaning and language filtering.
#### Use with HF Datasets library
To load this dataset with Datasets, you'll need to install Datasets as 'pip install datasets --upgrade' and then use code similar to the following:
In the above code snippet, "CohereForAI-aya\_collection-templated\_mintaka" refers to the Arabic version (100k rows) of the original "templated\_mintaka" subset (780k rows) of the aya\_collection. You can load other subsets by specifying their names when loading the dataset.
### Access and Contribution
Available on the Hugging Face Hub under 2A2I/Arabic\_Aya, 'Arabic Aya' invites contributions from the community. Users are encouraged to offer feedback and suggest improvements.
### Support and Collaboration
We are committed to fostering an inclusive and supportive environment around Arabic AI and NLP research. For support, collaboration, or queries regarding the dataset, please reach out through the Hugging Face Hub's discussion section or at 2A2I Contact Email.
Original Dataset Card of Aya by CohereForAI
===========================================
!Aya Header
Dataset Summary
===============
The Aya Collection is a massive multilingual collection consisting of 513 million instances of prompts and completions covering a wide range of tasks.
This collection incorporates instruction-style templates from fluent speakers and applies them to a curated list of datasets, as well as translations of instruction-style datasets into 101 languages. Aya Dataset, a human-curated multilingual instruction and response dataset, is also part of this collection. See our paper for more details regarding the collection.
* Curated by: Contributors of Aya Open Science Initiative
* Language(s): 115 languages
* License: Apache 2.0
* Aya Datasets Family:
| Name | Explanation |
|------|--------------|
| aya\_dataset | Human-annotated multilingual instruction finetuning dataset, comprising over 204K instances across 65 languages. |
| aya\_collection | Created by applying instruction-style templates from fluent speakers to 44 datasets, including translations of 19 instruction-style datasets into 101 languages.|
| aya\_evaluation\_suite | A diverse evaluation set for multilingual open-ended generation, featuring 250 culturally grounded prompts in 7 languages, 200 translated prompts in 24 languages, and human-edited versions selected for cross-cultural relevance from English Dolly in 6 languages.|
Dataset
=======
The 'Aya Collection' is a comprehensive, large corpus of datasets that can be used by researchers around the world to train multilingual models. Our goal is only to include datasets with permissive licensing for manipulation and redistribution.
The 'Aya Collection' consists of three different sources of data:
1. Templated data: We collaborated with fluent speakers to create templates that allowed for the automatic expansion of existing datasets into various languages.
2. Translated data: We translated a hand-selected subset of 19 datasets into 101 languages (114 dialects) using the NLLB 3.3B parameter machine translation model.
3. Aya Dataset: We release the Aya Dataset as a subset of the overall collection. This is the only dataset in the collection that is human-annotated in its entirety.
Load with Datasets
------------------
To load this dataset with Datasets, you'll need to install Datasets as 'pip install datasets --upgrade' and then use the following code:
In the above code snippet, "templated\_mintaka" refers to a subset of the aya\_collection. You can load other subsets by specifying their names when loading the dataset.
Data Instances
--------------
An example of a 'train' instance looks as follows:
Data Fields
-----------
The data fields are the same among all splits:
* 'id:' Unique id of the data point
* 'inputs:' Prompt or input to the language model.
* 'targets:' Completion or output of the language model.
* 'dataset\_name:' The name of the source dataset that the data point was taken from
* 'sub\_dataset\_name:' If the source is a collection, this field indicates which part of that collection the data point was taken from. If it is not a collection, this field is left blank.
* 'task\_type:' The task type that this conversation belongs to.
* 'template\_id': The id of the template applied to this data point.
* 'language:' The ISO code of the dialect of the conversation.
* 'script:' The script of the language.
* 'split:' Indicates whether the data point is part of the 'train' or the 'test' split.
### Statistics
The total number of data points, including the Aya Dataset, is 513,758,189. To view the breakdown of dialect codes and the respective templated and translated data point counts in the Aya Collection, refer to the toggled table below.
**Breakdown of Aya Collection data point counts grouped by dialects**
PS: Templated data also includes Mozambican Portuguese, which doesn't have its own ISO language code.
Motivations & Intentions
========================
* Curation Rationale: Automatic augmentation of existing datasets serves to enhance the available linguistic resources for multiple languages. The list of languages was initially established from mT5 and aligned with the annotators’ language list and NLLB translation model. The datasets were translated directly from English for all languages.
Additional Information
======================
Provenance
----------
* Methods Used: A combination of crowd-sourced templating and automatic translation was employed to source this dataset.
* Methodology Details:
+ *Source:* Existing NLP datasets
+ *Dates of Collection:* May 2023 - Dec 2023
Dataset Version and Maintenance
-------------------------------
* Maintenance Status: Actively Maintained
* Version Details:
+ *Current version:* 1.0
+ *Last Update:* 02/2024
+ *First Release:* 02/2024
Authorship
----------
* Publishing Organization: Cohere For AI
* Industry Type: Not-for-profit - Tech
* Contact Details: URL
Licensing Information
---------------------
This dataset can be used for any purpose, whether academic or commercial, under the terms of the Apache 2.0 License.
| [
"### Overview\n\n\n'Arabic Aya' is a meticulously curated dataset derived from the comprehensive Aya collection by CohereForAI, specifically focusing on Arabic text data. This dataset aggregates content from the CohereForAI/aya\\_collection, CohereForAI/aya\\_dataset, and CohereForAI/aya\\_evaluation\\_suite, filtering out all but the Arabic content, including both Modern Standard Arabic (MSA) and various regional dialects.",
"### Purpose\n\n\nThe aim of 'Arabic Aya' is to provide researchers, technologists, and linguists with a ready-to-use Arabic text resource, significantly reducing the time and effort required for data preprocessing in NLP and AI projects focused on the Arabic language.\n\n\n* Use the Aya datasets out of the box for your Arabic applications and research",
"### Dataset Sources & Infos\n\n\n* Data Origin: Derived from 69 subsets of the original Aya datasets : CohereForAI/aya\\_collection, CohereForAI/aya\\_dataset, and CohereForAI/aya\\_evaluation\\_suite.\n* Languages: Modern Standard Arabic (MSA) and a variety of Arabic dialects ( 'arb', 'arz', 'ary', 'ars', 'knc', 'acm', 'apc', 'aeb', 'ajp', 'acq' )\n* Applications: Ideal for tasks such as language modeling, text classification, sentiment analysis, dialect identification, and machine translation.\n* Paper: 2402.06619\n* Maintainer: Elfilali Ali\n* License: Apache-2.0",
"### Usage\n\n\nThis dataset serves as a foundational tool for those embarking on Arabic language projects, from academic research to commercial applications. By providing a pre-filtered source of Arabic text, 'Arabic Aya' enables users to dive straight into model training, analysis, and application development without the preliminary hassle of data cleaning and language filtering.",
"#### Use with HF Datasets library\n\n\nTo load this dataset with Datasets, you'll need to install Datasets as 'pip install datasets --upgrade' and then use a similar code to the following:\n\n\nIn the above code snippet, \"CohereForAI-aya\\_collection-templated\\_mintaka\" refers to the arabic version (100k rows) of the original \"templated\\_mintaka\" subset (780k rows) of the aya\\_collection. You can load other subsets by specifying its name at the time of loading the dataset.",
"### Access and Contribution\n\n\nAvailable on the Hugging Face Hub under 2A2I/Arabic\\_Aya, 'Arabic Aya' invites contributions from the community. Users are encouraged to offer feedback, suggest improvements.",
"### Support and Collaboration\n\n\nWe are committed to fostering an inclusive and supportive environment around Arabic AI and NLP research. For support, collaboration, or queries regarding the dataset, please reach out through the Hugging Face Hub's discussion section or reach out at 2A2I Contact Email.\n\n\nOriginal Dataset Card of Aya by CohereForAI\n===========================================\n\n\n!Aya Header\n\n\nDataset Summary\n===============\n\n\nThe Aya Collection is a massive multilingual collection consisting of 513 million instances of prompts and completions covering a wide range of tasks.\nThis collection incorporates instruction-style templates from fluent speakers and applies them to a curated list of datasets, as well as translations of instruction-style datasets into 101 languages. Aya Dataset, a human-curated multilingual instruction and response dataset, is also part of this collection. See our paper for more details regarding the collection.\n\n\n* Curated by: Contributors of Aya Open Science Intiative\n* Language(s): 115 languages\n* License: Apache 2.0\n* Aya Datasets Family:\n| Name | Explanation |\n|------|--------------|\n| aya\\_dataset | Human-annotated multilingual instruction finetuning dataset, comprising over 204K instances across 65 languages. |\n| aya\\_collection | Created by applying instruction-style templates from fluent speakers to 44 datasets, including translations of 19 instruction-style datasets into 101 languages.|\n| aya\\_evaluation\\_suite | A diverse evaluation set for multilingual open-ended generation, featuring 250 culturally grounded prompts in 7 languages, 200 translated prompts in 24 languages, and human-edited versions selected for cross-cultural relevance from English Dolly in 6 languages.|\n\n\nDataset\n=======\n\n\nThe 'Aya Collection' is a comprehensive, large corpus of datasets that can be used by researchers around the world to train multilingual models. Our goal is only to include datasets with permissive licensing for manipulation and redistribution.\n\n\nThe 'Aya Collection' consists of three different sources of data:\n\n\n1. Templated data: We collaborated with fluent speakers to create templates that allowed for the automatic expansion of existing datasets into various languages.\n2. Translated data: We translated a hand-selected subset of 19 datasets into 101 languages (114 dialects) using the NLLB 3.3B parameter machine translation model.\n3. Aya Dataset: We release the Aya Dataset as a subset of the overall collection. This is the only dataset in the collection that is human-annotated in its entirety.\n\n\nLoad with Datasets\n------------------\n\n\nTo load this dataset with Datasets, you'll need to install Datasets as 'pip install datasets --upgrade' and then use the following code:\n\n\nIn the above code snippet, \"templated\\_mintaka\" refers to a subset of the aya\\_collection. You can load other subsets by specifying its name at the time of loading the dataset.\n\n\nData Instances\n--------------\n\n\nAn example of a 'train' instance looks as follows:\n\n\nData Fields\n-----------\n\n\nThe data fields are the same among all splits:\n\n\n* 'id:' Unique id of the data point\n* 'inputs:' Prompt or input to the language model.\n* 'targets:' Completion or output of the language model.\n* 'dataset\\_name:' The name of the source dataset that the data point was taken from\n* 'sub\\_dataset\\_name:' If the source is a collection, this field indicates which part of that collection the data point was taken from. 
If it is not a collection, this field is left blank.\n* 'task\\_type:' The task type that this conversation belongs to.\n* 'template\\_id': The id of the template applied to this data point.\n* 'language:' The ISO code of the dialect of the conversation.\n* 'script:' The script of the language.\n* 'split:' Indicates whether the data point is part of the 'train' or the 'test' split.",
"### Statistics\n\n\nThe total number of data points, including the Aya Dataset' is 513,758,189. To view the breakdown of dialect codes and the respective templated and translated data point counts in the Aya Collection , refer to the toggled table below.\n\n\n\n **Breakdown of Aya Collection data point counts grouped by dialects** \n\nPS: Templated data also includes Mozambican Portuguese, which doesn't have its own ISO language code.\n\n\n\n \n\nMotivations & Intentions\n========================\n\n\n* Curation Rationale: Automatic augmentation of existing datasets serves to enhance the available linguistic resources for multiple languages. The list of languages was initially established from mT5 and aligned with the annotators’ language list and NLLB translation model. The datasets were translated directly from English for all languages.\n\n\nAdditional Information\n======================\n\n\nProvenance\n----------\n\n\n* Methods Used: A combination of crowd-sourced templating and automatic translation was employed to source this dataset.\n* Methodology Details:\n\t+ *Source:* Existing NLP datasets\n\t+ *Dates of Collection:* May 2023 - Dec 2023\n\n\nDataset Version and Maintenance\n-------------------------------\n\n\n* Maintenance Status: Actively Maintained\n* Version Details:\n\t+ *Current version:* 1.0\n\t+ *Last Update:* 02/2024\n\t+ *First Release:* 02/2024\n\n\nAuthorship\n----------\n\n\n* Publishing Organization: Cohere For AI\n* Industry Type: Not-for-profit - Tech\n* Contact Details: URL\n\n\nLicensing Information\n---------------------\n\n\nThis dataset can be used for any purpose, whether academic or commercial, under the terms of the Apache 2.0 License."
] | [
"TAGS\n#task_categories-text-classification #task_categories-translation #task_categories-summarization #size_categories-1M<n<10M #language-Arabic #license-apache-2.0 #arxiv-2402.06619 #region-us \n",
"### Overview\n\n\n'Arabic Aya' is a meticulously curated dataset derived from the comprehensive Aya collection by CohereForAI, specifically focusing on Arabic text data. This dataset aggregates content from the CohereForAI/aya\\_collection, CohereForAI/aya\\_dataset, and CohereForAI/aya\\_evaluation\\_suite, filtering out all but the Arabic content, including both Modern Standard Arabic (MSA) and various regional dialects.",
"### Purpose\n\n\nThe aim of 'Arabic Aya' is to provide researchers, technologists, and linguists with a ready-to-use Arabic text resource, significantly reducing the time and effort required for data preprocessing in NLP and AI projects focused on the Arabic language.\n\n\n* Use the Aya datasets out of the box for your Arabic applications and research",
"### Dataset Sources & Infos\n\n\n* Data Origin: Derived from 69 subsets of the original Aya datasets : CohereForAI/aya\\_collection, CohereForAI/aya\\_dataset, and CohereForAI/aya\\_evaluation\\_suite.\n* Languages: Modern Standard Arabic (MSA) and a variety of Arabic dialects ( 'arb', 'arz', 'ary', 'ars', 'knc', 'acm', 'apc', 'aeb', 'ajp', 'acq' )\n* Applications: Ideal for tasks such as language modeling, text classification, sentiment analysis, dialect identification, and machine translation.\n* Paper: 2402.06619\n* Maintainer: Elfilali Ali\n* License: Apache-2.0",
"### Usage\n\n\nThis dataset serves as a foundational tool for those embarking on Arabic language projects, from academic research to commercial applications. By providing a pre-filtered source of Arabic text, 'Arabic Aya' enables users to dive straight into model training, analysis, and application development without the preliminary hassle of data cleaning and language filtering.",
"#### Use with HF Datasets library\n\n\nTo load this dataset with Datasets, you'll need to install Datasets as 'pip install datasets --upgrade' and then use a similar code to the following:\n\n\nIn the above code snippet, \"CohereForAI-aya\\_collection-templated\\_mintaka\" refers to the arabic version (100k rows) of the original \"templated\\_mintaka\" subset (780k rows) of the aya\\_collection. You can load other subsets by specifying its name at the time of loading the dataset.",
"### Access and Contribution\n\n\nAvailable on the Hugging Face Hub under 2A2I/Arabic\\_Aya, 'Arabic Aya' invites contributions from the community. Users are encouraged to offer feedback, suggest improvements.",
"### Support and Collaboration\n\n\nWe are committed to fostering an inclusive and supportive environment around Arabic AI and NLP research. For support, collaboration, or queries regarding the dataset, please reach out through the Hugging Face Hub's discussion section or reach out at 2A2I Contact Email.\n\n\nOriginal Dataset Card of Aya by CohereForAI\n===========================================\n\n\n!Aya Header\n\n\nDataset Summary\n===============\n\n\nThe Aya Collection is a massive multilingual collection consisting of 513 million instances of prompts and completions covering a wide range of tasks.\nThis collection incorporates instruction-style templates from fluent speakers and applies them to a curated list of datasets, as well as translations of instruction-style datasets into 101 languages. Aya Dataset, a human-curated multilingual instruction and response dataset, is also part of this collection. See our paper for more details regarding the collection.\n\n\n* Curated by: Contributors of Aya Open Science Intiative\n* Language(s): 115 languages\n* License: Apache 2.0\n* Aya Datasets Family:\n| Name | Explanation |\n|------|--------------|\n| aya\\_dataset | Human-annotated multilingual instruction finetuning dataset, comprising over 204K instances across 65 languages. |\n| aya\\_collection | Created by applying instruction-style templates from fluent speakers to 44 datasets, including translations of 19 instruction-style datasets into 101 languages.|\n| aya\\_evaluation\\_suite | A diverse evaluation set for multilingual open-ended generation, featuring 250 culturally grounded prompts in 7 languages, 200 translated prompts in 24 languages, and human-edited versions selected for cross-cultural relevance from English Dolly in 6 languages.|\n\n\nDataset\n=======\n\n\nThe 'Aya Collection' is a comprehensive, large corpus of datasets that can be used by researchers around the world to train multilingual models. Our goal is only to include datasets with permissive licensing for manipulation and redistribution.\n\n\nThe 'Aya Collection' consists of three different sources of data:\n\n\n1. Templated data: We collaborated with fluent speakers to create templates that allowed for the automatic expansion of existing datasets into various languages.\n2. Translated data: We translated a hand-selected subset of 19 datasets into 101 languages (114 dialects) using the NLLB 3.3B parameter machine translation model.\n3. Aya Dataset: We release the Aya Dataset as a subset of the overall collection. This is the only dataset in the collection that is human-annotated in its entirety.\n\n\nLoad with Datasets\n------------------\n\n\nTo load this dataset with Datasets, you'll need to install Datasets as 'pip install datasets --upgrade' and then use the following code:\n\n\nIn the above code snippet, \"templated\\_mintaka\" refers to a subset of the aya\\_collection. You can load other subsets by specifying its name at the time of loading the dataset.\n\n\nData Instances\n--------------\n\n\nAn example of a 'train' instance looks as follows:\n\n\nData Fields\n-----------\n\n\nThe data fields are the same among all splits:\n\n\n* 'id:' Unique id of the data point\n* 'inputs:' Prompt or input to the language model.\n* 'targets:' Completion or output of the language model.\n* 'dataset\\_name:' The name of the source dataset that the data point was taken from\n* 'sub\\_dataset\\_name:' If the source is a collection, this field indicates which part of that collection the data point was taken from. 
If it is not a collection, this field is left blank.\n* 'task\\_type:' The task type that this conversation belongs to.\n* 'template\\_id': The id of the template applied to this data point.\n* 'language:' The ISO code of the dialect of the conversation.\n* 'script:' The script of the language.\n* 'split:' Indicates whether the data point is part of the 'train' or the 'test' split.",
"### Statistics\n\n\nThe total number of data points, including the Aya Dataset' is 513,758,189. To view the breakdown of dialect codes and the respective templated and translated data point counts in the Aya Collection , refer to the toggled table below.\n\n\n\n **Breakdown of Aya Collection data point counts grouped by dialects** \n\nPS: Templated data also includes Mozambican Portuguese, which doesn't have its own ISO language code.\n\n\n\n \n\nMotivations & Intentions\n========================\n\n\n* Curation Rationale: Automatic augmentation of existing datasets serves to enhance the available linguistic resources for multiple languages. The list of languages was initially established from mT5 and aligned with the annotators’ language list and NLLB translation model. The datasets were translated directly from English for all languages.\n\n\nAdditional Information\n======================\n\n\nProvenance\n----------\n\n\n* Methods Used: A combination of crowd-sourced templating and automatic translation was employed to source this dataset.\n* Methodology Details:\n\t+ *Source:* Existing NLP datasets\n\t+ *Dates of Collection:* May 2023 - Dec 2023\n\n\nDataset Version and Maintenance\n-------------------------------\n\n\n* Maintenance Status: Actively Maintained\n* Version Details:\n\t+ *Current version:* 1.0\n\t+ *Last Update:* 02/2024\n\t+ *First Release:* 02/2024\n\n\nAuthorship\n----------\n\n\n* Publishing Organization: Cohere For AI\n* Industry Type: Not-for-profit - Tech\n* Contact Details: URL\n\n\nLicensing Information\n---------------------\n\n\nThis dataset can be used for any purpose, whether academic or commercial, under the terms of the Apache 2.0 License."
] | [
70,
112,
80,
185,
80,
144,
53,
977,
379
] | [
"passage: TAGS\n#task_categories-text-classification #task_categories-translation #task_categories-summarization #size_categories-1M<n<10M #language-Arabic #license-apache-2.0 #arxiv-2402.06619 #region-us \n### Overview\n\n\n'Arabic Aya' is a meticulously curated dataset derived from the comprehensive Aya collection by CohereForAI, specifically focusing on Arabic text data. This dataset aggregates content from the CohereForAI/aya\\_collection, CohereForAI/aya\\_dataset, and CohereForAI/aya\\_evaluation\\_suite, filtering out all but the Arabic content, including both Modern Standard Arabic (MSA) and various regional dialects.### Purpose\n\n\nThe aim of 'Arabic Aya' is to provide researchers, technologists, and linguists with a ready-to-use Arabic text resource, significantly reducing the time and effort required for data preprocessing in NLP and AI projects focused on the Arabic language.\n\n\n* Use the Aya datasets out of the box for your Arabic applications and research### Dataset Sources & Infos\n\n\n* Data Origin: Derived from 69 subsets of the original Aya datasets : CohereForAI/aya\\_collection, CohereForAI/aya\\_dataset, and CohereForAI/aya\\_evaluation\\_suite.\n* Languages: Modern Standard Arabic (MSA) and a variety of Arabic dialects ( 'arb', 'arz', 'ary', 'ars', 'knc', 'acm', 'apc', 'aeb', 'ajp', 'acq' )\n* Applications: Ideal for tasks such as language modeling, text classification, sentiment analysis, dialect identification, and machine translation.\n* Paper: 2402.06619\n* Maintainer: Elfilali Ali\n* License: Apache-2.0",
"passage: ### Usage\n\n\nThis dataset serves as a foundational tool for those embarking on Arabic language projects, from academic research to commercial applications. By providing a pre-filtered source of Arabic text, 'Arabic Aya' enables users to dive straight into model training, analysis, and application development without the preliminary hassle of data cleaning and language filtering.#### Use with HF Datasets library\n\n\nTo load this dataset with Datasets, you'll need to install Datasets as 'pip install datasets --upgrade' and then use a similar code to the following:\n\n\nIn the above code snippet, \"CohereForAI-aya\\_collection-templated\\_mintaka\" refers to the arabic version (100k rows) of the original \"templated\\_mintaka\" subset (780k rows) of the aya\\_collection. You can load other subsets by specifying its name at the time of loading the dataset.### Access and Contribution\n\n\nAvailable on the Hugging Face Hub under 2A2I/Arabic\\_Aya, 'Arabic Aya' invites contributions from the community. Users are encouraged to offer feedback, suggest improvements."
] |
79dbcc8eddc54ac5ec465c6504ad602188b7a499 | # Persian_News_Dataset
# Dataset Summary
persian_news_dataset is a collection of 5 million news articles. The articles have been gathered from more than 10 news agencies over the last 12 years. This dataset can be used for different NLP tasks such as language modeling, classification, and supervised topic modeling.
This effort is part of a broader goal of building several Persian-language datasets, for a variety of tasks, that share two important qualities: `free` and `easy-to-use`. Here is a quick HOW-TO for using this dataset with the datasets library: [Demo-datasets](https://saied71.github.io/saied-alimoradi-blog/posts/2021-9-4-demo-datasets.html)
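A minimal loading sketch (assuming a standard `train` split and the `text`/`title`/`category` fields described under Description below):

```python
from itertools import islice

from datasets import load_dataset

# Stream the corpus so the full ~5M articles are not downloaded up front.
dataset = load_dataset("saied/persian_news_dataset", split="train", streaming=True)

# Peek at a few records; each one carries text, title, and category.
for article in islice(dataset, 3):
    print(article["title"], "|", article["category"])
```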
# Description
As discussed above, this dataset contains 5M news articles. Each article has three attributes: text, title, and category. Here is a sample from the dataset:
```
text :سهشنبه شب از دور برگشت مرحله نیمهنهایی لیگ قهرمانان اروپا، منچسترسیتی در ورزشگاه «اتحاد» میزبان پاریسنژرمن بود و با ارائه نمایشی حساب شده و تحسین برانگیز به پیروزی دو بر صفر دست یافت.بازی رفت در پاریس با برتری دو بر یک سیتی به اتمام رسیده بود و با این اوصاف تیم تحت هدایت «پپ گواردیولا» در مجموع با پیروزی چهار بر یک، راهی فینال شد.بارش برف موجب سفیدپوش شدن زمین شده بود و همین امر بر عملکرد تیمها تاثیر گذاشت. دیدار در حالی آغاز به کار کرد که «امباپه» ستاره پاریسیها که به تازگی از مصدومیت رهایی پیدا کرده است، نیمکتنشین بود.بازی با حملات میهمان آغاز شد و در دقیقه هفتم داور هلندی با تصمیمی عجیب اعتقاد داشت توپ به دست «زینچنکو» مدافع سیتی برخورد کرده و نقطه پنالتی را نشان داد، اما با استفاده از سیستم کمک داور ویدئویی، پنالتی پس گرفته شد. سیتی خیلی زود به هدفش رسید و در دقیقه ۱۰ حرکت عالی او و پاس به «دیبروین» موجب شد تا توپ در یک رفت و برگشت به «ریاض محرز» رسیده و این بازیکن الجزایری گل نخست بازی را برای میزبان به ارمغان آورد.در دقیقه ۱۶ ضربه سر «مارکینیوش» مدافع پیشتاخته پاریسنژرمن با بدشانسی به تیرک دروازه سیتی برخورد کرد.در ادامه برای دقایقی، بازیکنان در میانه میدان خطاهای متعددی انجام دادند و این امر موجب ایجاد چند درگیری شد.هرچند نماینده فرانسه درپی جبران مافات بود اما برنامهای برای رسیدن به این مهم نداشت تا نیمه نخست با همین یک گل همراه شود.در نیمه دوم هم حملات پاریسیها سودی نداشت و در طرف مقابل منچسترسیتی، بازی بسیار هوشمندانهای ارائه کرد.در دقیقه ۶۲ و در ضد حملهای برق آسا، «فیل فودن» با پاسی عالی توپ را به «ریاض محرز» رساند تا این بازیکن گل دوم خود و تیمش را ثبت کرده و سند صعود سیتی به فینال را امضا کند.در دقیقه ۶۸ «آنخل دیماریا» وینگر آرژانتینی تیم پاریسنژرمن پس از درگیری با «فرناندینو» با کارت قرمز داور از زمین اخراج شد تا کار تیمش تمام شود.در این بازی پاریسنژرمن با تفکرات «پوچتینو»، طراحی حملات خود را به «نیمار» سپرده بود اما این بازیکن مطرح برزیلی با حرکات انفرادی بیش از از اندازه، عملکرد خوبی نداشت و حملات تیمش را خراب کرد.در نهایت بازی با پیروزی سیتی همراه شد و مالکان ثروتمند منچسترسیتی به آرزوی خود رسیده و پس از سالها سرمایهگذاری به دیدار نهایی رسیدند. این اولین حضور سیتی در فینال لیگ قهرمانان اروپا است.چهارشنبه شب در دیگر دیدار دور برگشت نیمهنهایی، چلسی انگلیس در ورزشگاه «استمفورد بریج» شهر لندن پذیرای رئالمادرید اسپانیا است. بازی رفت با تساوی یک بر یک به اتمام رسید
title:آرزوی سیتی برآورده شد؛ صعود شاگردان «گواردیولا» به فینال
category:ورزش
```
# Citation
```
[email protected]
title={persian_news_dataset},
author={Saied Alimoradi},
year={2021}
}
```
| saied/persian_news_dataset | [
"task_categories:text-classification",
"task_categories:text-generation",
"task_ids:language-modeling",
"task_ids:multi-class-classification",
"source_datasets:original",
"language:fa",
"region:us"
] | 2024-02-13T17:17:51+00:00 | {"language": ["fa"], "source_datasets": ["original"], "task_categories": ["text-classification", "text-generation"], "task_ids": ["language-modeling", "multi-class-classification"], "pretty_name": "persian_news_dataset"} | 2024-02-14T11:08:07+00:00 | [] | [
"fa"
] | TAGS
#task_categories-text-classification #task_categories-text-generation #task_ids-language-modeling #task_ids-multi-class-classification #source_datasets-original #language-Persian #region-us
| # Persian_News_Dataset
# Dataset Summary
persian_news_dataset is a collection of 5 million news articles. News articles have been gathered from more than 10 news agencies for the last 12 years. This dataset can be used in different NLP tasks like language modeling, classification, supervised topic modeling,...
This effort is part of a bigger perspective to have several datasets in Persian language for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets
# Description
As discussed before, this dataset contains 5M news articles. Each article has these three attributes: text, title, category. Here is a sample of dataset:
| [
"# Persian_News_Dataset",
"# Dataset Summary\n\npersian_news_dataset is a collection of 5 million news articles. News articles have been gathered from more than 10 news agencies for the last 12 years. This dataset can be used in different NLP tasks like language modeling, classification, supervised topic modeling,...\n\nThis effort is part of a bigger perspective to have several datasets in Persian language for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets",
"# Description\n\nAs discussed before, this dataset contains 5M news articles. Each article has these three attributes: text, title, category. Here is a sample of dataset:"
] | [
"TAGS\n#task_categories-text-classification #task_categories-text-generation #task_ids-language-modeling #task_ids-multi-class-classification #source_datasets-original #language-Persian #region-us \n",
"# Persian_News_Dataset",
"# Dataset Summary\n\npersian_news_dataset is a collection of 5 million news articles. News articles have been gathered from more than 10 news agencies for the last 12 years. This dataset can be used in different NLP tasks like language modeling, classification, supervised topic modeling,...\n\nThis effort is part of a bigger perspective to have several datasets in Persian language for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets",
"# Description\n\nAs discussed before, this dataset contains 5M news articles. Each article has these three attributes: text, title, category. Here is a sample of dataset:"
] | [
63,
8,
135,
39
] | [
"passage: TAGS\n#task_categories-text-classification #task_categories-text-generation #task_ids-language-modeling #task_ids-multi-class-classification #source_datasets-original #language-Persian #region-us \n# Persian_News_Dataset# Dataset Summary\n\npersian_news_dataset is a collection of 5 million news articles. News articles have been gathered from more than 10 news agencies for the last 12 years. This dataset can be used in different NLP tasks like language modeling, classification, supervised topic modeling,...\n\nThis effort is part of a bigger perspective to have several datasets in Persian language for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets# Description\n\nAs discussed before, this dataset contains 5M news articles. Each article has these three attributes: text, title, category. Here is a sample of dataset:"
] |
5af03458a70055c92c58d15d573c4add31b749c0 | # Dataset Card for "PocketMarkedDataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Lollitor/PocketMarkedDataset | [
"region:us"
] | 2024-02-13T17:23:08+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "LABEL", "dtype": "float64"}, {"name": "INPUT", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 5646386, "num_examples": 7959}, {"name": "validation", "num_bytes": 622532, "num_examples": 885}], "download_size": 3168627, "dataset_size": 6268918}} | 2024-02-13T17:23:13+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "PocketMarkedDataset"
More Information needed | [
"# Dataset Card for \"PocketMarkedDataset\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"PocketMarkedDataset\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"PocketMarkedDataset\"\n\nMore Information needed"
] |
68e21a54e8eaaad099de0c93df5b078ac319324d | # Dataset Card for "CASFPocketMarked"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Lollitor/CASFPocketMarked | [
"region:us"
] | 2024-02-13T17:27:34+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "INPUT", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 283686, "num_examples": 285}], "download_size": 100036, "dataset_size": 283686}} | 2024-02-13T17:27:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "CASFPocketMarked"
More Information needed | [
"# Dataset Card for \"CASFPocketMarked\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"CASFPocketMarked\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"CASFPocketMarked\"\n\nMore Information needed"
] |
5e16accac9ebb000a14abadaeb466027e96d3c22 | # Dataset Card for "FSPocketMarked"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Lollitor/FSPocketMarked | [
"region:us"
] | 2024-02-13T17:30:39+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "INPUT", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 16932705, "num_examples": 16245}], "download_size": 253360, "dataset_size": 16932705}} | 2024-02-13T17:30:42+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "FSPocketMarked"
More Information needed | [
"# Dataset Card for \"FSPocketMarked\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"FSPocketMarked\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"FSPocketMarked\"\n\nMore Information needed"
] |
6c0c7e490ddbfb29cc99f4078cc572a567768644 |
# GeoEDdA: A Gold Standard Dataset for Geo-semantic Annotation of Diderot & d’Alembert’s Encyclopédie
## Dataset Description
<!-- Provide a longer summary of what this model is. -->
- **Authors:** [Ludovic Moncla](https://ludovicmoncla.github.io), [Katherine McDonough](https://www.lancaster.ac.uk/dsi/about-us/members/katherine-mcdonough#projects) and [Denis Vigier](http://www.icar.cnrs.fr/membre/dvigier/) in the framework of the [GEODE](https://geode-project.github.io) project.
- **Data source:** [ARTFL Encyclopédie Project](https://artfl-project.uchicago.edu), University of Chicago
- **Github repository:** [https://github.com/GEODE-project/ner-spancat-edda](https://github.com/GEODE-project/ner-spancat-edda)
- **Language:** French
- **License:** cc-by-nc-4.0
- **Zenodo repository:** https://zenodo.org/records/10530178
### Dataset Summary
This dataset contains semantic annotations (at the token and span levels) for named entities (Spatial, Person, and MISC), nominal entities, nested named entities, spatial relations, and other relevant information within French encyclopedic entries.
The span tagset is as follows:
- **NC-Spatial**: a common noun that identifies a spatial entity (nominal spatial entity) including natural features, e.g. `ville`, `la rivière`, `royaume`.
- **NP-Spatial**: a proper noun identifying the name of a place (spatial named entities), e.g. `France`, `Paris`, `la Chine`.
- **ENE-Spatial**: nested spatial entity, e.g. `ville de France`, `royaume de Naples`, `la mer Baltique`.
- **Relation**: spatial relation, e.g. `dans`, `sur`, `à 10 lieues de`.
- **Latlong**: geographic coordinates, e.g. `Long. 19. 49. lat. 43. 55. 44.`
- **NC-Person**: a common noun that identifies a person (nominal spatial entity), e.g. `roi`, `l'empereur`, `les auteurs`.
- **NP-Person**: a proper noun identifying the name of a person (person named entities), e.g. `Louis XIV`, `Pline`, `les Romains`.
- **ENE-Person**: nested people entity, e.g. `le czar Pierre`, `roi de Macédoine`
- **NP-Misc**: a proper noun identifying entities not classified as spatial or person, e.g. `l'Eglise`, `1702`, `Pélasgique`.
- **ENE-Misc**: nested named entity not classified as spatial or person, e.g. `l'ordre de S. Jacques`, `la déclaration du 21 Mars 1671`.
- **Head**: entry name
- **Domain-Mark**: words indicating the knowledge domain (usually after the head and in parentheses), e.g. `Géographie`, `Geog.`, `en Anatomie`.
### Supported Tasks
- `token-classification` or `span-classification`: The dataset can be used to train a model for `token-classification` or `span-classification`.
It is more specifically designed for spatial role labelling. A spaCy custom spancat model is available at: https://huggingface.co/GEODE/fr_spacy_custom_spancat_edda.
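A rough usage sketch for that pipeline is shown below; the wheel URL and the `"sc"` spans key (spaCy's default for spancat components) are assumptions to verify on the model page.

```python
import spacy

# Hypothetical install step (check the model page for the exact wheel URL):
#   pip install https://huggingface.co/GEODE/fr_spacy_custom_spancat_edda/resolve/main/<wheel>.whl
nlp = spacy.load("fr_spacy_custom_spancat_edda")

doc = nlp("ILLESCAS, (Géog.) petite ville d'Espagne")

# Spancat predictions are stored in doc.spans under the component's spans_key;
# "sc" is spaCy's default key for spancat pipelines (assumed here).
for span in doc.spans.get("sc", []):
    print(span.text, "->", span.label_)
```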
## Dataset Structure
The dataset is provided as JSONL files[^1] where each row has the following structure:
```
{
"text": " ILLESCAS, (Géog.) petite ville d'Espagne <...> ",
"meta": {"volume": 8, "head": "ILLESCAS", "author": "unsigned", "domain_article": "Géographie", "domain_paragraph": "Géographie", "article": 2637, "paragraph": 1},
"tokens": [{"text": " ", "start": 0, "end": 1, "id": 0, "ws": false}, {"text": "ILLESCAS", "start": 1, "end": 9, "id": 1, "ws": false}, {"text": ",", "start": 9, "end": 10, "id": 2, "ws": true}, {"text": "(", "start": 11, "end": 12, "id": 3, "ws": false}, {"text": "Géog", "start": 12, "end": 16, "id": 4, "ws": false}, {"text": ".", "start": 16, "end": 17, "id": 5, "ws": false}, {"text": ")", "start": 17, "end": 18, "id": 6, "ws": true}, {"text": "petite", "start": 19, "end": 25, "id": 7, "ws": true}, {"text": "ville", "start": 26, "end": 31, "id": 8, "ws": true}, {"text": "d'", "start": 32, "end": 34, "id": 9, "ws": false}, {"text": "Espagne", "start": 34, "end": 41, "id": 10, "ws": false}, {"text": ",", "start": 41, "end": 42, "id": 11, "ws": true} <...>],
"spans": [{"text": "ILLESCAS", "start": 1, "end": 9, "token_start": 1, "token_end": 1, "label": "Head"}, {"text": "Géog.", "start": 12, "end": 17, "token_start": 4, "token_end": 5, "label": "Domain-mark"}, {"text": "petite ville", "start": 19, "end": 31, "token_start": 7, "token_end": 8, "label": "NC-Spatial"}, {"text": "petite ville d'Espagne", "start": 19, "end": 41, "token_start": 7, "token_end": 10, "label": "ENE-Spatial"}, {"text": "Espagne", "start": 34, "end": 41, "token_start": 10, "token_end": 10, "label": "NP-Spatial"}, <...>]
}
```
Each record contains four main fields:
- `text`: plain text of a paragraph.
- `meta`: metadata from the ARTFL Encyclopédie about the paragraph, such as volume, article, paragraph id, headword, etc.
- `tokens`: list of tokens, with their text, id, start and end position at the character level.
- `spans`: list of spans (i.e., annotations), with their text, label, start and end position at the character level.
[^1]:spaCy binary files are also available on the [Github](https://github.com/GEODE-project/ner-spancat-edda) and [Zenodo](https://zenodo.org/records/10530178) repositories.
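A minimal sketch of consuming these fields (the file name is illustrative; the actual train/validation/test JSONL files live in the repositories above):

```python
import json

# Illustrative path; substitute the actual train/validation/test JSONL file.
with open("train.jsonl", encoding="utf-8") as f:
    for line in f:
        record = json.loads(line)
        for span in record["spans"]:
            if span["label"] in ("NC-Spatial", "NP-Spatial", "ENE-Spatial"):
                # start/end are character offsets into record["text"]
                surface = record["text"][span["start"]:span["end"]]
                print(record["meta"]["head"], "->", surface, f"({span['label']})")
```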
### Data Splits
The dataset consists of 2,200 paragraphs randomly selected from 2,001 entries of the Encyclopédie.
All paragraphs are written in French and are distributed among the Encyclopédie knowledge domains as follows:
| Knowledge domain | Paragraphs |
|---|:---:|
| Géographie | 1096 |
| Histoire | 259 |
| Droit Jurisprudence | 113 |
| Physique | 92 |
| Métiers | 92 |
| Médecine | 88 |
| Philosophie | 69 |
| Histoire naturelle | 65 |
| Belles-lettres | 65 |
| Militaire | 62 |
| Commerce | 48 |
| Beaux-arts | 44 |
| Agriculture | 36 |
| Chasse | 31 |
| Religion | 23 |
| Musique | 17 |
The spans/entities were labelled by the project team, with pre-labelling from early models used to speed up the annotation process.
A train/val/test split was used.
The validation and test sets contain 200 paragraphs each: 100 classified as 'Géographie' and 100 from other knowledge domains.
The datasets have the following breakdown of tokens and spans/entities.
| | Train | Validation | Test|
|---|:---:|:---:|:---:|
|Paragraphs| 1,800 | 200 | 200|
| Tokens | 134,254 | 15,167 | 14,079 |
| NC-Spatial | 3,268 | 358 | 357 |
| NP-Spatial | 4,719 | 464 | 522 |
| ENE-Spatial | 3,044 | 326 | 334 |
| Relation | 2,101 | 220 | 226 |
| Latlong | 553 | 66 | 72 |
| NC-Person | 1,378 | 132 | 133 |
| NP-Person | 1,603 | 170 | 150 |
| ENE-Person | 491 | 49 | 57 |
| NP-Misc | 953 | 108 | 96 |
| ENE-Misc | 255 | 31 | 22 |
| Head | 1,264 | 143 | 154 |
| Domain-Mark | 1,069 | 122 | 133 |
## Additional Information
### Dataset Curators
List of people involved in annotating the dataset:
* [Ludovic Moncla](https://ludovicmoncla.github.io) ([@ludovicmoncla](https://github.com/ludovicmoncla)), INSA Lyon, CNRS, LIRIS UMR 5205
* [Katherine McDonough](https://www.lancaster.ac.uk/dsi/about-us/members/katherine-mcdonough#projects) ([@kmcdono2](https://github.com/kmcdono2)), Lancaster University & The Alan Turing Institute
### Acknowledgement
The authors are grateful to the [ASLAN project](https://aslan.universite-lyon.fr) (ANR-10-LABX-0081) of the Université de Lyon for its financial support within the French program "Investments for the Future" operated by the National Research Agency (ANR).
Data courtesy of the [ARTFL Encyclopédie Project](https://artfl-project.uchicago.edu), University of Chicago.
| GEODE/GeoEDdA | [
"task_categories:token-classification",
"language:fr",
"license:cc-by-nc-4.0",
"spacy",
"region:us"
] | 2024-02-13T17:31:05+00:00 | {"language": ["fr"], "license": "cc-by-nc-4.0", "task_categories": ["token-classification"], "pretty_name": "GeoEDdA", "tags": ["spacy"]} | 2024-02-17T06:20:56+00:00 | [] | [
"fr"
] | TAGS
#task_categories-token-classification #language-French #license-cc-by-nc-4.0 #spacy #region-us
| GeoEDdA: A Gold Standard Dataset for Geo-semantic Annotation of Diderot & d’Alembert’s Encyclopédie
===================================================================================================
Dataset Description
-------------------
* Authors: Ludovic Moncla, Katherine McDonough and Denis Vigier in the framework of the GEODE project.
* Data source: ARTFL Encyclopédie Project, University of Chicago
* Github repository: URL
* Language: French
* License: cc-by-nc-4.0
* Zenodo repository: URL
### Dataset Summary
This dataset contains semantic annotations (at the token and span levels) for named entities (such as Spatial, Person, and MISC), nominal entities, as well as nested named entities, spatial relations, and other relevant information within French encyclopedic entries.
The span tagset is as follows:
* NC-Spatial: a common noun that identifies a spatial entity (nominal spatial entity) including natural features, e.g. 'ville', 'la rivière', 'royaume'.
* NP-Spatial: a proper noun identifying the name of a place (spatial named entities), e.g. 'France', 'Paris', 'la Chine'.
* ENE-Spatial: nested spatial entity , e.g. 'ville de France', 'royaume de Naples', 'la mer Baltique'.
* Relation: spatial relation, e.g. 'dans', 'sur', 'à 10 lieues de'.
* Latlong: geographic coordinates, e.g. 'Long. 19. 49. lat. 43. 55. 44.'
* NC-Person: a common noun that identifies a person (nominal spatial entity), e.g. 'roi', 'l'empereur', 'les auteurs'.
* NP-Person: a proper noun identifying the name of a person (person named entities), e.g. 'Louis XIV', 'Pline', 'les Romains'.
* ENE-Person: nested people entity, e.g. 'le czar Pierre', 'roi de Macédoine'
* NP-Misc: a proper noun identifying entities not classified as spatial or person, e.g. 'l'Eglise', '1702', 'Pélasgique'.
* ENE-Misc: nested named entity not classified as spatial or person, e.g. 'l'ordre de S. Jacques', 'la déclaration du 21 Mars 1671'.
* Head: entry name
* Domain-Mark: words indicating the knowledge domain (usually after the head and between parenthesis), e.g. 'Géographie', 'Geog.', 'en Anatomie'.
### Supported Tasks
* 'token-classification' or 'span-classification': The dataset can be used to train a model for 'token-classification' or 'span-classification'.
It is more specifically designed for spatial role labelling. A spacy custom spancat model is available at: URL
Dataset Structure
-----------------
The dataset is provided as JSONL files[^1] where each row follows the following structure:
Each data contains four main fields:
* 'text': plain text of a paragraph.
* 'meta': metadata from the ARTFL Encyclopédie about the paragraph such volume, article, paragraph id, headword, etc.
* 'tokens': list of tokens, with their text, id, start and end position at the character level.
* 'spans': list of spans (i.e., annotations), with their text, label, start and end position at the character level.
[^1]:spaCy binary files are also available on the Github and Zenodo repositories.
### Data Splits
The dataset consists of 2200 paragraphs randomly selected out of 2001 Encyclopédie's entries.
All paragraphs were written in French and are distributed as follows among the Encyclopédie knowledge domains:
The spans/entities were labeled by the project team along with using pre-labelling with early models to speed up the labelling process.
A train/val/test split was used.
Validation and test sets are composed of 200 paragraphs each: 100 classified as 'Géographie' and 100 from another knowledge domain.
The datasets have the following breakdown of tokens and spans/entities.
Additional Information
----------------------
### Dataset Curators
List of people involved in annotating the dataset:
* Ludovic Moncla (@ludovicmoncla), INSA Lyon, CNRS, LIRIS UMR 5205
* Katherine McDonough (@kmcdono2, Lancaster University & The Alan Turing Institute
### Acknowledgement
The authors are grateful to the ASLAN project (ANR-10-LABX-0081) of the Université de Lyon, for its financial support within the French program "Investments for the Future" operated by the National Research Agency (ANR).
Data courtesy the ARTFL Encyclopédie Project, University of Chicago.
| [
"### Dataset Summary\n\n\nThis dataset contains semantic annotations (at the token and span levels) for named entities (such as Spatial, Person, and MISC), nominal entities, as well as nested named entities, spatial relations, and other relevant information within French encyclopedic entries.\n\n\nThe span tagset is as follows:\n\n\n* NC-Spatial: a common noun that identifies a spatial entity (nominal spatial entity) including natural features, e.g. 'ville', 'la rivière', 'royaume'.\n* NP-Spatial: a proper noun identifying the name of a place (spatial named entities), e.g. 'France', 'Paris', 'la Chine'.\n* ENE-Spatial: nested spatial entity , e.g. 'ville de France', 'royaume de Naples', 'la mer Baltique'.\n* Relation: spatial relation, e.g. 'dans', 'sur', 'à 10 lieues de'.\n* Latlong: geographic coordinates, e.g. 'Long. 19. 49. lat. 43. 55. 44.'\n* NC-Person: a common noun that identifies a person (nominal spatial entity), e.g. 'roi', 'l'empereur', 'les auteurs'.\n* NP-Person: a proper noun identifying the name of a person (person named entities), e.g. 'Louis XIV', 'Pline', 'les Romains'.\n* ENE-Person: nested people entity, e.g. 'le czar Pierre', 'roi de Macédoine'\n* NP-Misc: a proper noun identifying entities not classified as spatial or person, e.g. 'l'Eglise', '1702', 'Pélasgique'.\n* ENE-Misc: nested named entity not classified as spatial or person, e.g. 'l'ordre de S. Jacques', 'la déclaration du 21 Mars 1671'.\n* Head: entry name\n* Domain-Mark: words indicating the knowledge domain (usually after the head and between parenthesis), e.g. 'Géographie', 'Geog.', 'en Anatomie'.",
"### Supported Tasks\n\n\n* 'token-classification' or 'span-classification': The dataset can be used to train a model for 'token-classification' or 'span-classification'.\nIt is more specifically designed for spatial role labelling. A spacy custom spancat model is available at: URL\n\n\nDataset Structure\n-----------------\n\n\nThe dataset is provided as JSONL files[^1] where each row follows the following structure:\n\n\nEach data contains four main fields:\n\n\n* 'text': plain text of a paragraph.\n* 'meta': metadata from the ARTFL Encyclopédie about the paragraph such volume, article, paragraph id, headword, etc.\n* 'tokens': list of tokens, with their text, id, start and end position at the character level.\n* 'spans': list of spans (i.e., annotations), with their text, label, start and end position at the character level.\n\n\n[^1]:spaCy binary files are also available on the Github and Zenodo repositories.",
"### Data Splits\n\n\nThe dataset consists of 2200 paragraphs randomly selected out of 2001 Encyclopédie's entries.\nAll paragraphs were written in French and are distributed as follows among the Encyclopédie knowledge domains:\n\n\n\nThe spans/entities were labeled by the project team along with using pre-labelling with early models to speed up the labelling process.\nA train/val/test split was used.\nValidation and test sets are composed of 200 paragraphs each: 100 classified as 'Géographie' and 100 from another knowledge domain.\nThe datasets have the following breakdown of tokens and spans/entities.\n\n\n\nAdditional Information\n----------------------",
"### Dataset Curators\n\n\nList of people involved in annotating the dataset:\n\n\n* Ludovic Moncla (@ludovicmoncla), INSA Lyon, CNRS, LIRIS UMR 5205\n* Katherine McDonough (@kmcdono2, Lancaster University & The Alan Turing Institute",
"### Acknowledgement\n\n\nThe authors are grateful to the ASLAN project (ANR-10-LABX-0081) of the Université de Lyon, for its financial support within the French program \"Investments for the Future\" operated by the National Research Agency (ANR).\nData courtesy the ARTFL Encyclopédie Project, University of Chicago."
] | [
"TAGS\n#task_categories-token-classification #language-French #license-cc-by-nc-4.0 #spacy #region-us \n",
"### Dataset Summary\n\n\nThis dataset contains semantic annotations (at the token and span levels) for named entities (such as Spatial, Person, and MISC), nominal entities, as well as nested named entities, spatial relations, and other relevant information within French encyclopedic entries.\n\n\nThe span tagset is as follows:\n\n\n* NC-Spatial: a common noun that identifies a spatial entity (nominal spatial entity) including natural features, e.g. 'ville', 'la rivière', 'royaume'.\n* NP-Spatial: a proper noun identifying the name of a place (spatial named entities), e.g. 'France', 'Paris', 'la Chine'.\n* ENE-Spatial: nested spatial entity , e.g. 'ville de France', 'royaume de Naples', 'la mer Baltique'.\n* Relation: spatial relation, e.g. 'dans', 'sur', 'à 10 lieues de'.\n* Latlong: geographic coordinates, e.g. 'Long. 19. 49. lat. 43. 55. 44.'\n* NC-Person: a common noun that identifies a person (nominal spatial entity), e.g. 'roi', 'l'empereur', 'les auteurs'.\n* NP-Person: a proper noun identifying the name of a person (person named entities), e.g. 'Louis XIV', 'Pline', 'les Romains'.\n* ENE-Person: nested people entity, e.g. 'le czar Pierre', 'roi de Macédoine'\n* NP-Misc: a proper noun identifying entities not classified as spatial or person, e.g. 'l'Eglise', '1702', 'Pélasgique'.\n* ENE-Misc: nested named entity not classified as spatial or person, e.g. 'l'ordre de S. Jacques', 'la déclaration du 21 Mars 1671'.\n* Head: entry name\n* Domain-Mark: words indicating the knowledge domain (usually after the head and between parenthesis), e.g. 'Géographie', 'Geog.', 'en Anatomie'.",
"### Supported Tasks\n\n\n* 'token-classification' or 'span-classification': The dataset can be used to train a model for 'token-classification' or 'span-classification'.\nIt is more specifically designed for spatial role labelling. A spacy custom spancat model is available at: URL\n\n\nDataset Structure\n-----------------\n\n\nThe dataset is provided as JSONL files[^1] where each row follows the following structure:\n\n\nEach data contains four main fields:\n\n\n* 'text': plain text of a paragraph.\n* 'meta': metadata from the ARTFL Encyclopédie about the paragraph such volume, article, paragraph id, headword, etc.\n* 'tokens': list of tokens, with their text, id, start and end position at the character level.\n* 'spans': list of spans (i.e., annotations), with their text, label, start and end position at the character level.\n\n\n[^1]:spaCy binary files are also available on the Github and Zenodo repositories.",
"### Data Splits\n\n\nThe dataset consists of 2200 paragraphs randomly selected out of 2001 Encyclopédie's entries.\nAll paragraphs were written in French and are distributed as follows among the Encyclopédie knowledge domains:\n\n\n\nThe spans/entities were labeled by the project team along with using pre-labelling with early models to speed up the labelling process.\nA train/val/test split was used.\nValidation and test sets are composed of 200 paragraphs each: 100 classified as 'Géographie' and 100 from another knowledge domain.\nThe datasets have the following breakdown of tokens and spans/entities.\n\n\n\nAdditional Information\n----------------------",
"### Dataset Curators\n\n\nList of people involved in annotating the dataset:\n\n\n* Ludovic Moncla (@ludovicmoncla), INSA Lyon, CNRS, LIRIS UMR 5205\n* Katherine McDonough (@kmcdono2, Lancaster University & The Alan Turing Institute",
"### Acknowledgement\n\n\nThe authors are grateful to the ASLAN project (ANR-10-LABX-0081) of the Université de Lyon, for its financial support within the French program \"Investments for the Future\" operated by the National Research Agency (ANR).\nData courtesy the ARTFL Encyclopédie Project, University of Chicago."
] | [
38,
553,
240,
152,
62,
75
] | [
"passage: TAGS\n#task_categories-token-classification #language-French #license-cc-by-nc-4.0 #spacy #region-us \n",
"passage: ### Dataset Summary\n\n\nThis dataset contains semantic annotations (at the token and span levels) for named entities (such as Spatial, Person, and MISC), nominal entities, as well as nested named entities, spatial relations, and other relevant information within French encyclopedic entries.\n\n\nThe span tagset is as follows:\n\n\n* NC-Spatial: a common noun that identifies a spatial entity (nominal spatial entity) including natural features, e.g. 'ville', 'la rivière', 'royaume'.\n* NP-Spatial: a proper noun identifying the name of a place (spatial named entities), e.g. 'France', 'Paris', 'la Chine'.\n* ENE-Spatial: nested spatial entity , e.g. 'ville de France', 'royaume de Naples', 'la mer Baltique'.\n* Relation: spatial relation, e.g. 'dans', 'sur', 'à 10 lieues de'.\n* Latlong: geographic coordinates, e.g. 'Long. 19. 49. lat. 43. 55. 44.'\n* NC-Person: a common noun that identifies a person (nominal spatial entity), e.g. 'roi', 'l'empereur', 'les auteurs'.\n* NP-Person: a proper noun identifying the name of a person (person named entities), e.g. 'Louis XIV', 'Pline', 'les Romains'.\n* ENE-Person: nested people entity, e.g. 'le czar Pierre', 'roi de Macédoine'\n* NP-Misc: a proper noun identifying entities not classified as spatial or person, e.g. 'l'Eglise', '1702', 'Pélasgique'.\n* ENE-Misc: nested named entity not classified as spatial or person, e.g. 'l'ordre de S. Jacques', 'la déclaration du 21 Mars 1671'.\n* Head: entry name\n* Domain-Mark: words indicating the knowledge domain (usually after the head and between parenthesis), e.g. 'Géographie', 'Geog.', 'en Anatomie'.### Supported Tasks\n\n\n* 'token-classification' or 'span-classification': The dataset can be used to train a model for 'token-classification' or 'span-classification'.\nIt is more specifically designed for spatial role labelling. A spacy custom spancat model is available at: URL\n\n\nDataset Structure\n-----------------\n\n\nThe dataset is provided as JSONL files[^1] where each row follows the following structure:\n\n\nEach data contains four main fields:\n\n\n* 'text': plain text of a paragraph.\n* 'meta': metadata from the ARTFL Encyclopédie about the paragraph such volume, article, paragraph id, headword, etc.\n* 'tokens': list of tokens, with their text, id, start and end position at the character level.\n* 'spans': list of spans (i.e., annotations), with their text, label, start and end position at the character level.\n\n\n[^1]:spaCy binary files are also available on the Github and Zenodo repositories.### Data Splits\n\n\nThe dataset consists of 2200 paragraphs randomly selected out of 2001 Encyclopédie's entries.\nAll paragraphs were written in French and are distributed as follows among the Encyclopédie knowledge domains:\n\n\n\nThe spans/entities were labeled by the project team along with using pre-labelling with early models to speed up the labelling process.\nA train/val/test split was used.\nValidation and test sets are composed of 200 paragraphs each: 100 classified as 'Géographie' and 100 from another knowledge domain.\nThe datasets have the following breakdown of tokens and spans/entities.\n\n\n\nAdditional Information\n----------------------### Dataset Curators\n\n\nList of people involved in annotating the dataset:\n\n\n* Ludovic Moncla (@ludovicmoncla), INSA Lyon, CNRS, LIRIS UMR 5205\n* Katherine McDonough (@kmcdono2, Lancaster University & The Alan Turing Institute"
] |
043f6cea9047e32ee11666ce14f6762d40669c33 |
# Dataset Card for Evaluation run of DreadPoor/ToppyLake-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/ToppyLake-7B-slerp](https://huggingface.co/DreadPoor/ToppyLake-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T17:37:50.313114](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp/blob/main/results_2024-02-13T17-37-50.313114.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6527643322107648,
"acc_stderr": 0.03209219499038095,
"acc_norm": 0.65299648074574,
"acc_norm_stderr": 0.032754010922687565,
"mc1": 0.4700122399020808,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6253804109646736,
"mc2_stderr": 0.01533242283561269
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.6919795221843004,
"acc_norm_stderr": 0.013491429517292038
},
"harness|hellaswag|10": {
"acc": 0.6954789882493527,
"acc_stderr": 0.004592637369905785,
"acc_norm": 0.8698466440948018,
"acc_norm_stderr": 0.0033578442491239546
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.02355964698318994,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.02355964698318994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553332,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553332
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323788,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323788
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3888268156424581,
"acc_stderr": 0.016303899530796123,
"acc_norm": 0.3888268156424581,
"acc_norm_stderr": 0.016303899530796123
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.01272978538659856,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.01272978538659856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4700122399020808,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6253804109646736,
"mc2_stderr": 0.01533242283561269
},
"harness|winogrande|5": {
"acc": 0.8279400157853196,
"acc_stderr": 0.010607731615247012
},
"harness|gsm8k|5": {
"acc": 0.6595905989385898,
"acc_stderr": 0.013052097103299102
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp | [
"region:us"
] | 2024-02-13T17:40:13+00:00 | {"pretty_name": "Evaluation run of DreadPoor/ToppyLake-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [DreadPoor/ToppyLake-7B-slerp](https://huggingface.co/DreadPoor/ToppyLake-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T17:37:50.313114](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp/blob/main/results_2024-02-13T17-37-50.313114.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527643322107648,\n \"acc_stderr\": 0.03209219499038095,\n \"acc_norm\": 0.65299648074574,\n \"acc_norm_stderr\": 0.032754010922687565,\n \"mc1\": 0.4700122399020808,\n \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6253804109646736,\n \"mc2_stderr\": 0.01533242283561269\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n \"acc_norm\": 0.6919795221843004,\n \"acc_norm_stderr\": 0.013491429517292038\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6954789882493527,\n \"acc_stderr\": 0.004592637369905785,\n \"acc_norm\": 0.8698466440948018,\n \"acc_norm_stderr\": 0.0033578442491239546\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.02355964698318994,\n \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.02355964698318994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553332,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553332\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8352490421455939,\n \"acc_stderr\": 0.013265346261323788,\n \"acc_norm\": 0.8352490421455939,\n \"acc_norm_stderr\": 0.013265346261323788\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3888268156424581,\n \"acc_stderr\": 0.016303899530796123,\n \"acc_norm\": 0.3888268156424581,\n \"acc_norm_stderr\": 0.016303899530796123\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.01272978538659856,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.01272978538659856\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4700122399020808,\n \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6253804109646736,\n \"mc2_stderr\": 0.01533242283561269\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615247012\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6595905989385898,\n \"acc_stderr\": 0.013052097103299102\n 
}\n}\n```", "repo_url": "https://huggingface.co/DreadPoor/ToppyLake-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|arc:challenge|25_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|gsm8k|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hellaswag|10_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-37-50.313114.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-37-50.313114.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-37-50.313114.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T17-37-50.313114.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-37-50.313114.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T17_37_50.313114", "path": ["**/details_harness|winogrande|5_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T17-37-50.313114.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_13T17_37_50.313114", "path": ["results_2024-02-13T17-37-50.313114.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T17-37-50.313114.parquet"]}]}]} | 2024-02-13T17:40:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DreadPoor/ToppyLake-7B-slerp
Dataset automatically created during the evaluation run of model DreadPoor/ToppyLake-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
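A minimal loading sketch, mirroring the snippet given in the full card above (the config name is one of the 63 task configurations listed in this record's metadata):

```python
from datasets import load_dataset

# Each evaluated task is its own configuration; "train" points at the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_DreadPoor__ToppyLake-7B-slerp",
    "harness_winogrande_5",
    split="train",
)
```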
## Latest results
These are the latest results from run 2024-02-13T17:37:50.313114 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of DreadPoor/ToppyLake-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model DreadPoor/ToppyLake-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T17:37:50.313114(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DreadPoor/ToppyLake-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model DreadPoor/ToppyLake-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T17:37:50.313114(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DreadPoor/ToppyLake-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model DreadPoor/ToppyLake-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T17:37:50.313114(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
cacf05246a0125b9b158393892f2dbc4300114d7 | # Dataset Card for "Big-Dataset-0214"
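The card body below is empty, but the `dataset_info` in this record's metadata describes the schema. A minimal loading sketch based on that metadata (assuming the repository is publicly readable):

```python
from datasets import load_dataset

# Per the metadata: a single "train" split with 19,151 examples and features
#   mask_image (image), text (string), image (image)
ds = load_dataset("ouvic215/Big-Dataset-0214", split="train")
print(ds[0]["text"])
```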
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ouvic215/Big-Dataset-0214 | [
"region:us"
] | 2024-02-13T17:50:19+00:00 | {"dataset_info": {"features": [{"name": "mask_image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 1575504233.25, "num_examples": 19151}], "download_size": 1217667292, "dataset_size": 1575504233.25}} | 2024-02-13T17:53:05+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Big-Dataset-0214"
More Information needed | [
"# Dataset Card for \"Big-Dataset-0214\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Big-Dataset-0214\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"Big-Dataset-0214\"\n\nMore Information needed"
] |
caa50e4c4707c4fb6314cde7a0dfb08009a5db16 |
# Dataset Card for Evaluation run of arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5](https://huggingface.co/arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_arlineka__Brunhilde-2x7b-MOE-DPO-v.01.5",
"harness_winogrande_5",
split="train")
```
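
Before picking a configuration, it can help to enumerate what is available. This is a minimal sketch, assuming network access to the Hugging Face Hub and a recent `datasets` version:

```python
from datasets import get_dataset_config_names

# List the 63 per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_arlineka__Brunhilde-2x7b-MOE-DPO-v.01.5"
)
print(len(configs))
print(configs[:5])  # e.g. the first few harness_* task configs
```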
## Latest results
These are the [latest results from run 2024-02-13T17:48:37.130833](https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-2x7b-MOE-DPO-v.01.5/blob/main/results_2024-02-13T17-48-37.130833.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6527087935188053,
"acc_stderr": 0.032054116160077556,
"acc_norm": 0.6538210694369178,
"acc_norm_stderr": 0.0327027142406073,
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961564,
"mc2": 0.6547268141518184,
"mc2_stderr": 0.015285268254002284
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.013688147309729125,
"acc_norm": 0.6953924914675768,
"acc_norm_stderr": 0.01344952210993249
},
"harness|hellaswag|10": {
"acc": 0.6914957179844653,
"acc_stderr": 0.0046093200248938935,
"acc_norm": 0.8702449711212906,
"acc_norm_stderr": 0.003353469625027664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240658,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240658
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.016574027219517635,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.016574027219517635
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729477,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729477
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961564,
"mc2": 0.6547268141518184,
"mc2_stderr": 0.015285268254002284
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510427
},
"harness|gsm8k|5": {
"acc": 0.6300227445034117,
"acc_stderr": 0.013298661207727127
}
}
```
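
To work with these aggregated metrics programmatically, you can load the "results" configuration at its latest timestamp. This is a minimal sketch: the "results" config and its "latest" split are listed in this card's metadata, but the exact column layout of the parquet file is an assumption.

```python
from datasets import load_dataset

# Load the aggregated "results" configuration at its latest timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_arlineka__Brunhilde-2x7b-MOE-DPO-v.01.5",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the schema before reading metrics
```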
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_arlineka__Brunhilde-2x7b-MOE-DPO-v.01.5 | [
"region:us"
] | 2024-02-13T17:50:51+00:00 | {"pretty_name": "Evaluation run of arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5](https://huggingface.co/arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_arlineka__Brunhilde-2x7b-MOE-DPO-v.01.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T17:48:37.130833](https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-2x7b-MOE-DPO-v.01.5/blob/main/results_2024-02-13T17-48-37.130833.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527087935188053,\n \"acc_stderr\": 0.032054116160077556,\n \"acc_norm\": 0.6538210694369178,\n \"acc_norm_stderr\": 0.0327027142406073,\n \"mc1\": 0.5238678090575275,\n \"mc1_stderr\": 0.017483547156961564,\n \"mc2\": 0.6547268141518184,\n \"mc2_stderr\": 0.015285268254002284\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.013688147309729125,\n \"acc_norm\": 0.6953924914675768,\n \"acc_norm_stderr\": 0.01344952210993249\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6914957179844653,\n \"acc_stderr\": 0.0046093200248938935,\n \"acc_norm\": 0.8702449711212906,\n \"acc_norm_stderr\": 0.003353469625027664\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137282,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137282\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240658,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240658\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.4335195530726257,\n \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729477,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729477\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5238678090575275,\n \"mc1_stderr\": 0.017483547156961564,\n \"mc2\": 0.6547268141518184,\n \"mc2_stderr\": 0.015285268254002284\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510427\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6300227445034117,\n \"acc_stderr\": 
0.013298661207727127\n }\n}\n```", "repo_url": "https://huggingface.co/arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|arc:challenge|25_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|gsm8k|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hellaswag|10_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-48-37.130833.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-48-37.130833.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-48-37.130833.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T17-48-37.130833.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-48-37.130833.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T17_48_37.130833", "path": ["**/details_harness|winogrande|5_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T17-48-37.130833.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_13T17_48_37.130833", "path": ["results_2024-02-13T17-48-37.130833.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T17-48-37.130833.parquet"]}]}]} | 2024-02-13T17:51:14+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5
Dataset automatically created during the evaluation run of model arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
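Following the naming convention used by the other evaluation-run cards in this collection, a minimal loading sketch would be (the repository name below is inferred from the model name and may need adjusting):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_arlineka__Brunhilde-2x7b-MOE-DPO-v.01.5",
	"harness_winogrande_5",
	split="train")
```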
## Latest results
These are the latest results from run 2024-02-13T17:48:37.130833 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5\n\n\n\nDataset automatically created during the evaluation run of model arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T17:48:37.130833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5\n\n\n\nDataset automatically created during the evaluation run of model arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T17:48:37.130833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
205,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5\n\n\n\nDataset automatically created during the evaluation run of model arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T17:48:37.130833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
4ecb6b47fb1cd1a637e2f33169b18e139607bd76 |
This is a slightly reformatted (split spans and labels) version of the SubstanReview dataset; the original can be found at https://github.com/YanzhuGuo/SubstanReview. | kdercksen/substanreview | [
"region:us"
] | 2024-02-13T18:16:26+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "review", "dtype": "string"}, {"name": "spans", "sequence": {"sequence": "int64"}}, {"name": "labels", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 1264992, "num_examples": 440}, {"name": "test", "num_bytes": 299386, "num_examples": 110}], "download_size": 840892, "dataset_size": 1564378}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-13T18:17:25+00:00 | [] | [] | TAGS
#region-us
|
This is a slightly reformatted (split spans and labels) version of the SubstanReview dataset, the original can be found at URL | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
359e47b958e05f79d585a80d5c3b73d177c6eeeb | # Dataset Card for "stenotype-eval-type-stripped"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | franlucc/stenotype-eval-type-stripped | [
"region:us"
] | 2024-02-13T18:17:35+00:00 | {"dataset_info": {"features": [{"name": "hexsha", "dtype": "string"}, {"name": "size", "dtype": "int64"}, {"name": "ext", "dtype": "string"}, {"name": "lang", "dtype": "string"}, {"name": "max_stars_repo_path", "dtype": "string"}, {"name": "max_stars_repo_name", "dtype": "string"}, {"name": "max_stars_repo_head_hexsha", "dtype": "string"}, {"name": "max_stars_repo_licenses", "sequence": "string"}, {"name": "max_stars_count", "dtype": "float64"}, {"name": "max_stars_repo_stars_event_min_datetime", "dtype": "string"}, {"name": "max_stars_repo_stars_event_max_datetime", "dtype": "string"}, {"name": "max_issues_repo_path", "dtype": "string"}, {"name": "max_issues_repo_name", "dtype": "string"}, {"name": "max_issues_repo_head_hexsha", "dtype": "string"}, {"name": "max_issues_repo_licenses", "sequence": "string"}, {"name": "max_issues_count", "dtype": "float64"}, {"name": "max_issues_repo_issues_event_min_datetime", "dtype": "string"}, {"name": "max_issues_repo_issues_event_max_datetime", "dtype": "string"}, {"name": "max_forks_repo_path", "dtype": "string"}, {"name": "max_forks_repo_name", "dtype": "string"}, {"name": "max_forks_repo_head_hexsha", "dtype": "string"}, {"name": "max_forks_repo_licenses", "sequence": "string"}, {"name": "max_forks_count", "dtype": "float64"}, {"name": "max_forks_repo_forks_event_min_datetime", "dtype": "string"}, {"name": "max_forks_repo_forks_event_max_datetime", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "avg_line_length", "dtype": "float64"}, {"name": "max_line_length", "dtype": "int64"}, {"name": "alphanum_fraction", "dtype": "float64"}, {"name": "annotation_sites", "dtype": "int64"}, {"name": "type_definitions", "dtype": "int64"}, {"name": "loc", "dtype": "int64"}, {"name": "functions", "dtype": "int64"}, {"name": "loc_per_function", "dtype": "float64"}, {"name": "estimated_tokens", "dtype": "int64"}, {"name": "content_type_removed", "dtype": "string"}, {"name": "type_map", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3164953, "num_examples": 338}], "download_size": 1310544, "dataset_size": 3164953}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-13T18:17:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "stenotype-eval-type-stripped"
More Information needed | [
"# Dataset Card for \"stenotype-eval-type-stripped\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"stenotype-eval-type-stripped\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"stenotype-eval-type-stripped\"\n\nMore Information needed"
] |
5a63e98d8bc5ab89f4f2a2cd33a3ace514ebe1ef | # Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)
## Project Overview
The Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Using Python within Google Colab, we aim to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.
## Background and Rationale
Durham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.
## Data Sources and Methodology
### Data Sources
We will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:
- `GS_TreeInventory.shp`
- `Trees_&_Planting_Sites.csv`
- `Trees_%26_Planting_Sites.geojson`
# Dataset Card for Urban Tree Inventory
## Dataset Description
This dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.
### Spatial Data (GeoJSON)
**Format:** GeoJSON
**Content:**
- **Type:** `FeatureCollection` - A collection of feature objects.
- **Features:** Each feature object represents a tree and contains:
- **Type:** `Feature`
- **Geometry:** `Point` (includes longitude and latitude of the tree location).
- **Properties:** Detailed information about the tree (some fields may overlap with the CSV structure below).
### Tabular Data (CSV)
**Format:** CSV
**Columns:**
- **X, Y:** Coordinates of the tree location.
- **OBJECTID:** Unique identifier for the tree.
- **streetaddress:** Street address nearest to the tree.
- **city:** City where the tree is located.
- **zipcode:** Zip code for the location of the tree.
- **facilityid:** Identifier for the facility associated with the tree, if any.
- **present:** Indication of whether the tree is currently present.
- **genus, species, commonname:** Botanical and common names of the tree.
- **plantingdate:** Date when the tree was planted.
- **diameterin:** Diameter of the tree trunk in inches.
- **heightft:** Height of the tree in feet.
- **condition:** Health condition of the tree.
- **contractwork:** Indicates if the tree has had any contract work done.
- **neighborhood:** Neighborhood where the tree is located.
- **program:** The program under which the tree was planted.
- **plantingw:** Width of the planting site.
- **plantingcond:** Condition of the planting site.
- **underpwerlins:** Whether the tree is under power lines.
- **matureheight:** The mature height of the tree.
- **GlobalID:** A global unique identifier for the tree.
- **created_user:** The user who created the record.
- **created_date:** The date the record was created.
- **last_edited_user:** The user who last edited the record.
- **last_edited_date:** The date the record was last edited.
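To make the column descriptions concrete, here is a minimal pandas sketch for loading and inspecting the tabular data; the filename comes from the Data Sources list above, and the planting-date format is an assumption. The environmental and economic columns described next live in the same file.

```python
import pandas as pd

# Load the tabular tree inventory (filename as listed under Data Sources).
df = pd.read_csv("Trees_&_Planting_Sites.csv")

# Inspect a few of the columns described above.
print(df[["OBJECTID", "commonname", "diameterin", "heightft", "condition"]].head())

# Parse the planting date so tree age can be derived (date format assumed).
df["plantingdate"] = pd.to_datetime(df["plantingdate"], errors="coerce")
df["age_years"] = (pd.Timestamp.now() - df["plantingdate"]).dt.days / 365.25
```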
#### Environmental and Economic Data:
- **isoprene, monoterpene, vocs:** Emissions and absorption data for various compounds.
- **coremoved_ozperyr, o3removed_ozperyr, etc.:** Annual pollutant removal metrics.
- **o2production_lbperyr:** Annual oxygen production.
- **carbonstorage_lb, carbonstorage_dol:** Carbon storage metrics.
- **grosscarseq_lbperyr, grosscarseq_dolperyr:** Gross carbon sequestration.
- **avoidrunoff_ft2peryr, avoidrunoff_dol2peryr:** Metrics related to stormwater runoff avoidance.
- **totannbenefits_dolperyr:** Total annual dollar benefits from the tree.
- **leafarea_sqft, potevapotran_cuftperyr, etc.:** Metrics related to the water cycle.
- **heating_mbtuperyr, cooling_kwhperyr, etc.:** Energy savings related to the tree's impact on building energy use.
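Because these benefit metrics are recorded per tree, they can be aggregated into citywide totals. The sketch below assumes the filename and column names listed above:

```python
import pandas as pd

df = pd.read_csv("Trees_&_Planting_Sites.csv")

# Citywide ecosystem-service totals from the per-tree metrics above.
total_benefits = df["totannbenefits_dolperyr"].sum()
carbon_stored = df["carbonstorage_lb"].sum()
print(f"Total annual benefits: ${total_benefits:,.0f}")
print(f"Carbon stored: {carbon_stored:,.0f} lb")
```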
### Example Record
**GeoJSON Feature:**
```json
{
"type": "Feature",
"geometry": {
"type": "Point",
"coordinates": [-78.90863, 36.00441]
},
"properties": {
"OBJECTID": 2840940,
"commonname": "Willow Oak",
// Additional properties...
}
}
```
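To work with such records programmatically, the GeoJSON can be read into a GeoDataFrame; this is a sketch assuming the filename from the Data Sources list:

```python
import geopandas as gpd

# Each GeoJSON Feature becomes one row; the geometry column holds the Point.
trees = gpd.read_file("Trees_%26_Planting_Sites.geojson")

print(trees.crs)  # coordinate reference system of the point geometries
print(trees[["OBJECTID", "commonname"]].head())
```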
The `GS_TreeInventory.shp` file encompasses a range of attributes for each record:
- **OBJECTID:** Unique identifier for each record.
- **streetaddr:** Street address where the tree or planting site is located.
- **city:** The city name, which is Durham.
- **zipcode:** Postal code for the location.
- **facilityid:** Identifier possibly linked to a facility or area associated with the tree.
- **present:** Type of feature present, such as a tree or a planting site.
- **genus:** Genus of the tree.
- **species:** Species of the tree.
- **commonname:** Common name of the tree.
- **plantingda:** Date or year range when the tree was planted or the planting site was established.
- ...
### Objectives
1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.
2. Apply Python libraries to uncover relationships between tree canopy data and urban development.
3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.
4. Produce analyses and visualizations with the GeoJSON file.
### Methodology
Our analytical process within Google Colab will encompass:
- **Data Preparation and Integration:** Using tools like Geopandas, Pandas, and PyShp to organize and combine spatial and tabular data.
- **Geospatial Analysis:** Applying Shapely and Rtree for spatial analysis, and using SciPy or Statsmodels for statistical correlations.
- **Visualization and Optimization:** Generating maps and graphs with Matplotlib, Seaborn, or Plotly, and utilizing optimization algorithms to suggest optimal planting locations.
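As a concrete illustration of the first two steps, the sketch below joins the shapefile to the CSV and produces a simple map; the file names, join key, and column choices are assumptions based on the attribute lists above:

```python
import geopandas as gpd
import matplotlib.pyplot as plt
import pandas as pd

# Data preparation and integration: join spatial and tabular records.
inventory = gpd.read_file("GS_TreeInventory.shp")
tabular = pd.read_csv("Trees_&_Planting_Sites.csv")
merged = inventory.merge(tabular, on="OBJECTID", how="left", suffixes=("", "_csv"))

# Geospatial analysis: tree counts per neighborhood.
print(merged.groupby("neighborhood").size().sort_values(ascending=False).head(10))

# Visualization: tree locations colored by condition.
merged.plot(column="condition", legend=True, markersize=2, figsize=(8, 8))
plt.title("Durham tree inventory by condition")
plt.show()
```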
## Deliverables
1. A collection of Google Colab Python notebooks that outline our analytical processes.
2. Interactive maps and visualizations that connect tree canopy coverage with urban development metrics.
3. An exhaustive report that contains our findings and recommendations for enhancing the urban canopy.
## Limitations
- **Computational Resources:** The limited computational resources of Google Colab may constrain the size of the datasets and the complexity of the models we can employ.
- **Data Quality:** The accuracy and currency of the data ultimately affect the precision of our recommendations.
- **Sociopolitical Considerations:** Implementation of our data-driven suggestions must be reviewed within the context of local policy and community input.
## Conclusion
DUCAEI aims to create a more verdant and livable urban landscape in Durham through this Python-based analytical project. By laying a strong foundation for data-informed decision-making, we hope to cultivate a thriving, green, and sustainable urban environment. | Ziyuan111/Urban_Tree_Canopy_in_Durham | [
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-02-13T18:54:29+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"]} | 2024-02-17T14:46:57+00:00 | [] | [
"en"
] | TAGS
#size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
| # Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)
## Project Overview
The Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Through Python within Google Colab, our aim is to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.
## Background and Rationale
Durham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.
## Data Sources and Methodology
### Data Sources
We will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:
- 'GS_TreeInventory.shp'
- 'Trees_&_Planting_Sites.csv'
- 'Trees_%26_Planting_Sites.geojson'
# Dataset Card for Urban Tree Inventory
## Dataset Description
This dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.
### Spatial Data (GeoJSON)
Format: GeoJSON
Content:
- Type: 'FeatureCollection' - A collection of feature objects.
- Features: Each feature object represents a tree and contains:
- Type: 'Feature'
- Geometry: 'Point' (includes longitude and latitude of the tree location).
- Properties: Detailed information about the tree (some fields may overlap with the CSV structure below).
### Tabular Data (CSV)
Format: CSV
Columns:
- X, Y: Coordinates of the tree location.
- OBJECTID: Unique identifier for the tree.
- streetaddress: Street address nearest to the tree.
- city: City where the tree is located.
- zipcode: Zip code for the location of the tree.
- facilityid: Identifier for the facility associated with the tree, if any.
- present: Indication of whether the tree is currently present.
- genus, species, commonname: Botanical and common names of the tree.
- plantingdate: Date when the tree was planted.
- diameterin: Diameter of the tree trunk in inches.
- heightft: Height of the tree in feet.
- condition: Health condition of the tree.
- contractwork: Indicates if the tree has had any contract work done.
- neighborhood: Neighborhood where the tree is located.
- program: The program under which the tree was planted.
- plantingw: Width of the planting site.
- plantingcond: Condition of the planting site.
- underpwerlins: Whether the tree is under power lines.
- matureheight: The mature height of the tree.
- GlobalID: A global unique identifier for the tree.
- created_user: The user who created the record.
- created_date: The date the record was created.
- last_edited_user: The user who last edited the record.
- last_edited_date: The date the record was last edited.
#### Environmental and Economic Data:
- isoprene, monoterpene, vocs: Emissions and absorption data for various compounds.
- coremoved_ozperyr, o3removed_ozperyr, etc.: Annual pollutant removal metrics.
- o2production_lbperyr: Annual oxygen production.
- carbonstorage_lb, carbonstorage_dol: Carbon storage metrics.
- grosscarseq_lbperyr, grosscarseq_dolperyr: Gross carbon sequestration.
- avoidrunoff_ft2peryr, avoidrunoff_dol2peryr: Metrics related to stormwater runoff avoidance.
- totannbenefits_dolperyr: Total annual dollar benefits from the tree.
- leafarea_sqft, potevapotran_cuftperyr, etc.: Metrics related to the water cycle.
- heating_mbtuperyr, cooling_kwhperyr, etc.: Energy savings related to the tree's impact on building energy use.
### Example Record
GeoJSON Feature:
The 'GS_TreeInventory.shp' file encompasses a range of attributes for each record:
- OBJECTID: Unique identifier for each record.
- streetaddr: Street address where the tree or planting site is located.
- city: The city name, which is Durham.
- zipcode: Postal code for the location.
- facilityid: Identifier possibly linked to a facility or area associated with the tree.
- present: Type of feature present, such as a tree or a planting site.
- genus: Genus of the tree.
- species: Species of the tree.
- commonname: Common name of the tree.
- plantingda: Date or year range when the tree was planted or the planting site was established.
- ...
### Objectives
1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.
2. Apply Python libraries to uncover relationships between tree canopy data and urban development.
3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.
4. Produce analyses and visualizations with the GeoJSON file.
### Methodology
Our analytical process within Google Colab will encompass:
- Data Preparation and Integration: Using tools like Geopandas, Pandas, and PyShp to organize and combine spatial and tabular data.
- Geospatial Analysis: Applying Shapely and Rtree for spatial analysis, and using SciPy or Statsmodels for statistical correlations.
- Visualization and Optimization: Generating maps and graphs with Matplotlib, Seaborn, or Plotly, and utilizing optimization algorithms to suggest optimal planting locations.
## Deliverables
1. A collection of Google Colab Python notebooks that outline our analytical processes.
2. Interactive maps and visualizations that connect tree canopy coverage with urban development metrics.
3. An exhaustive report that contains our findings and recommendations for enhancing the urban canopy.
## Limitations
- Computational Resources: The limited computational offerings of Google Colab may pose a challenge to the size of the datasets or the complexity of models we can employ.
- Data Quality: The accuracy and currency of the data ultimately affect the precision of our recommendations.
- Sociopolitical Considerations: Implementation of our data-driven suggestions must be reviewed within the context of local policy and community input.
## Conclusion
DUCAEI aims to create a more verdant and livable urban landscape in Durham through this Python-based analytical project. By laying a strong foundation for data-informed decision-making, we hope to cultivate a thriving, green, and sustainable urban environment. | [
"# Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)",
"## Project Overview\n\nThe Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Through Python within Google Colab, our aim is to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.",
"## Background and Rationale\n\nDurham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.",
"## Data Sources and Methodology",
"### Data Sources\n\nWe will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:\n\n- 'GS_TreeInventory.shp'\n- 'Trees_&_Planting_Sites.csv'\n- 'Trees_%26_Planting_Sites.geojson'",
"# Dataset Card for Urban Tree Inventory",
"## Dataset Description\n\nThis dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.",
"### Spatial Data (GeoJSON)\n\nFormat: GeoJSON\n\nContent:\n\n- Type: 'FeatureCollection' - A collection of feature objects.\n- Features: Each feature object represents a tree and contains:\n - Type: 'Feature'\n - Geometry: 'Point' (includes longitude and latitude of the tree location).\n - Properties: Detailed information about the tree (some fields may overlap with the CSV structure below).",
"### Tabular Data (CSV)\n\nFormat: CSV\n\nColumns:\n\n- X, Y: Coordinates of the tree location.\n- OBJECTID: Unique identifier for the tree.\n- streetaddress: Street address nearest to the tree.\n- city: City where the tree is located.\n- zipcode: Zip code for the location of the tree.\n- facilityid: Identifier for the facility associated with the tree, if any.\n- present: Indication of whether the tree is currently present.\n- genus, species, commonname: Botanical and common names of the tree.\n- plantingdate: Date when the tree was planted.\n- diameterin: Diameter of the tree trunk in inches.\n- heightft: Height of the tree in feet.\n- condition: Health condition of the tree.\n- contractwork: Indicates if the tree has had any contract work done.\n- neighborhood: Neighborhood where the tree is located.\n- program: The program under which the tree was planted.\n- plantingw: Width of the planting site.\n- plantingcond: Condition of the planting site.\n- underpwerlins: Whether the tree is under power lines.\n- matureheight: The mature height of the tree.\n- GlobalID: A global unique identifier for the tree.\n- created_user: The user who created the record.\n- created_date: The date the record was created.\n- last_edited_user: The user who last edited the record.\n- last_edited_date: The date the record was last edited.",
"#### Environmental and Economic Data:\n\n- isoprene, monoterpene, vocs: Emissions and absorption data for various compounds.\n- coremoved_ozperyr, o3removed_ozperyr, etc.: Annual pollutant removal metrics.\n- o2production_lbperyr: Annual oxygen production.\n- carbonstorage_lb, carbonstorage_dol: Carbon storage metrics.\n- grosscarseq_lbperyr, grosscarseq_dolperyr: Gross carbon sequestration.\n- avoidrunoff_ft2peryr, avoidrunoff_dol2peryr: Metrics related to stormwater runoff avoidance.\n- totannbenefits_dolperyr: Total annual dollar benefits from the tree.\n- leafarea_sqft, potevapotran_cuftperyr, etc.: Metrics related to the water cycle.\n- heating_mbtuperyr, cooling_kwhperyr, etc.: Energy savings related to the tree's impact on building energy use.",
"### Example Record\n\nGeoJSON Feature:\n\nThe 'GS_TreeInventory.shp' file encompasses a range of attributes for each record:\n\n- OBJECTID: Unique identifier for each record.\n- streetaddr: Street address where the tree or planting site is located.\n- city: The city name, which is Durham.\n- zipcode: Postal code for the location.\n- facilityid: Identifier possibly linked to a facility or area associated with the tree.\n- present: Type of feature present, such as a tree or a planting site.\n- genus: Genus of the tree.\n- species: Species of the tree.\n- commonname: Common name of the tree.\n- plantingda: Date or year range when the tree was planted or the planting site was established.\n- ...",
"### Objectives\n\n1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.\n2. Apply Python libraries to uncover relationships between tree canopy data and urban development.\n3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.\n4. Produce analyses and visualizations with the GeoJSON file.",
"### Methodology\n\nOur analytical process within Google Colab will encompass:\n\n- Data Preparation and Integration: Using tools like Geopandas, Pandas, and PyShp to organize and combine spatial and tabular data.\n- Geospatial Analysis: Applying Shapely and Rtree for spatial analysis, and using SciPy or Statsmodels for statistical correlations.\n- Visualization and Optimization: Generating maps and graphs with Matplotlib, Seaborn, or Plotly, and utilizing optimization algorithms to suggest optimal planting locations.",
"## Deliverables\n\n1. A collection of Google Colab Python notebooks that outline our analytical processes.\n2. Interactive maps and visualizations that connect tree canopy coverage with urban development metrics.\n3. An exhaustive report that contains our findings and recommendations for enhancing the urban canopy.",
"## Limitations\n\n- Computational Resources: The limited computational offerings of Google Colab may pose a challenge to the size of the datasets or the complexity of models we can employ.\n- Data Quality: The accuracy and currency of the data ultimately affect the precision of our recommendations.\n- Sociopolitical Considerations: Implementation of our data-driven suggestions must be reviewed within the context of local policy and community input.",
"## Conclusion\n\nDUCAEI aims to create a more verdant and livable urban landscape in Durham through this Python-based analytical project. By laying a strong foundation for data-informed decision-making, we hope to cultivate a thriving, green, and sustainable urban environment."
] | [
"TAGS\n#size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n",
"# Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)",
"## Project Overview\n\nThe Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Through Python within Google Colab, our aim is to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.",
"## Background and Rationale\n\nDurham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.",
"## Data Sources and Methodology",
"### Data Sources\n\nWe will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:\n\n- 'GS_TreeInventory.shp'\n- 'Trees_&_Planting_Sites.csv'\n- 'Trees_%26_Planting_Sites.geojson'",
"# Dataset Card for Urban Tree Inventory",
"## Dataset Description\n\nThis dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.",
"### Spatial Data (GeoJSON)\n\nFormat: GeoJSON\n\nContent:\n\n- Type: 'FeatureCollection' - A collection of feature objects.\n- Features: Each feature object represents a tree and contains:\n - Type: 'Feature'\n - Geometry: 'Point' (includes longitude and latitude of the tree location).\n - Properties: Detailed information about the tree (some fields may overlap with the CSV structure below).",
"### Tabular Data (CSV)\n\nFormat: CSV\n\nColumns:\n\n- X, Y: Coordinates of the tree location.\n- OBJECTID: Unique identifier for the tree.\n- streetaddress: Street address nearest to the tree.\n- city: City where the tree is located.\n- zipcode: Zip code for the location of the tree.\n- facilityid: Identifier for the facility associated with the tree, if any.\n- present: Indication of whether the tree is currently present.\n- genus, species, commonname: Botanical and common names of the tree.\n- plantingdate: Date when the tree was planted.\n- diameterin: Diameter of the tree trunk in inches.\n- heightft: Height of the tree in feet.\n- condition: Health condition of the tree.\n- contractwork: Indicates if the tree has had any contract work done.\n- neighborhood: Neighborhood where the tree is located.\n- program: The program under which the tree was planted.\n- plantingw: Width of the planting site.\n- plantingcond: Condition of the planting site.\n- underpwerlins: Whether the tree is under power lines.\n- matureheight: The mature height of the tree.\n- GlobalID: A global unique identifier for the tree.\n- created_user: The user who created the record.\n- created_date: The date the record was created.\n- last_edited_user: The user who last edited the record.\n- last_edited_date: The date the record was last edited.",
"#### Environmental and Economic Data:\n\n- isoprene, monoterpene, vocs: Emissions and absorption data for various compounds.\n- coremoved_ozperyr, o3removed_ozperyr, etc.: Annual pollutant removal metrics.\n- o2production_lbperyr: Annual oxygen production.\n- carbonstorage_lb, carbonstorage_dol: Carbon storage metrics.\n- grosscarseq_lbperyr, grosscarseq_dolperyr: Gross carbon sequestration.\n- avoidrunoff_ft2peryr, avoidrunoff_dol2peryr: Metrics related to stormwater runoff avoidance.\n- totannbenefits_dolperyr: Total annual dollar benefits from the tree.\n- leafarea_sqft, potevapotran_cuftperyr, etc.: Metrics related to the water cycle.\n- heating_mbtuperyr, cooling_kwhperyr, etc.: Energy savings related to the tree's impact on building energy use.",
"### Example Record\n\nGeoJSON Feature:\n\nThe 'GS_TreeInventory.shp' file encompasses a range of attributes for each record:\n\n- OBJECTID: Unique identifier for each record.\n- streetaddr: Street address where the tree or planting site is located.\n- city: The city name, which is Durham.\n- zipcode: Postal code for the location.\n- facilityid: Identifier possibly linked to a facility or area associated with the tree.\n- present: Type of feature present, such as a tree or a planting site.\n- genus: Genus of the tree.\n- species: Species of the tree.\n- commonname: Common name of the tree.\n- plantingda: Date or year range when the tree was planted or the planting site was established.\n- ...",
"### Objectives\n\n1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.\n2. Apply Python libraries to uncover relationships between tree canopy data and urban development.\n3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.\n4. Produce analyses and visualizations with the GeoJSON file.",
"### Methodology\n\nOur analytical process within Google Colab will encompass:\n\n- Data Preparation and Integration: Using tools like Geopandas, Pandas, and PyShp to organize and combine spatial and tabular data.\n- Geospatial Analysis: Applying Shapely and Rtree for spatial analysis, and using SciPy or Statsmodels for statistical correlations.\n- Visualization and Optimization: Generating maps and graphs with Matplotlib, Seaborn, or Plotly, and utilizing optimization algorithms to suggest optimal planting locations.",
"## Deliverables\n\n1. A collection of Google Colab Python notebooks that outline our analytical processes.\n2. Interactive maps and visualizations that connect tree canopy coverage with urban development metrics.\n3. An exhaustive report that contains our findings and recommendations for enhancing the urban canopy.",
"## Limitations\n\n- Computational Resources: The limited computational offerings of Google Colab may pose a challenge to the size of the datasets or the complexity of models we can employ.\n- Data Quality: The accuracy and currency of the data ultimately affect the precision of our recommendations.\n- Sociopolitical Considerations: Implementation of our data-driven suggestions must be reviewed within the context of local policy and community input.",
"## Conclusion\n\nDUCAEI aims to create a more verdant and livable urban landscape in Durham through this Python-based analytical project. By laying a strong foundation for data-informed decision-making, we hope to cultivate a thriving, green, and sustainable urban environment."
] | [
30,
20,
108,
86,
7,
83,
10,
43,
105,
343,
235,
182,
86,
135,
69,
96,
63
] | [
"passage: TAGS\n#size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n# Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)## Project Overview\n\nThe Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Through Python within Google Colab, our aim is to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.## Background and Rationale\n\nDurham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.## Data Sources and Methodology### Data Sources\n\nWe will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:\n\n- 'GS_TreeInventory.shp'\n- 'Trees_&_Planting_Sites.csv'\n- 'Trees_%26_Planting_Sites.geojson'# Dataset Card for Urban Tree Inventory## Dataset Description\n\nThis dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.### Spatial Data (GeoJSON)\n\nFormat: GeoJSON\n\nContent:\n\n- Type: 'FeatureCollection' - A collection of feature objects.\n- Features: Each feature object represents a tree and contains:\n - Type: 'Feature'\n - Geometry: 'Point' (includes longitude and latitude of the tree location).\n - Properties: Detailed information about the tree (some fields may overlap with the CSV structure below).",
"passage: ### Tabular Data (CSV)\n\nFormat: CSV\n\nColumns:\n\n- X, Y: Coordinates of the tree location.\n- OBJECTID: Unique identifier for the tree.\n- streetaddress: Street address nearest to the tree.\n- city: City where the tree is located.\n- zipcode: Zip code for the location of the tree.\n- facilityid: Identifier for the facility associated with the tree, if any.\n- present: Indication of whether the tree is currently present.\n- genus, species, commonname: Botanical and common names of the tree.\n- plantingdate: Date when the tree was planted.\n- diameterin: Diameter of the tree trunk in inches.\n- heightft: Height of the tree in feet.\n- condition: Health condition of the tree.\n- contractwork: Indicates if the tree has had any contract work done.\n- neighborhood: Neighborhood where the tree is located.\n- program: The program under which the tree was planted.\n- plantingw: Width of the planting site.\n- plantingcond: Condition of the planting site.\n- underpwerlins: Whether the tree is under power lines.\n- matureheight: The mature height of the tree.\n- GlobalID: A global unique identifier for the tree.\n- created_user: The user who created the record.\n- created_date: The date the record was created.\n- last_edited_user: The user who last edited the record.\n- last_edited_date: The date the record was last edited.#### Environmental and Economic Data:\n\n- isoprene, monoterpene, vocs: Emissions and absorption data for various compounds.\n- coremoved_ozperyr, o3removed_ozperyr, etc.: Annual pollutant removal metrics.\n- o2production_lbperyr: Annual oxygen production.\n- carbonstorage_lb, carbonstorage_dol: Carbon storage metrics.\n- grosscarseq_lbperyr, grosscarseq_dolperyr: Gross carbon sequestration.\n- avoidrunoff_ft2peryr, avoidrunoff_dol2peryr: Metrics related to stormwater runoff avoidance.\n- totannbenefits_dolperyr: Total annual dollar benefits from the tree.\n- leafarea_sqft, potevapotran_cuftperyr, etc.: Metrics related to the water cycle.\n- heating_mbtuperyr, cooling_kwhperyr, etc.: Energy savings related to the tree's impact on building energy use.### Example Record\n\nGeoJSON Feature:\n\nThe 'GS_TreeInventory.shp' file encompasses a range of attributes for each record:\n\n- OBJECTID: Unique identifier for each record.\n- streetaddr: Street address where the tree or planting site is located.\n- city: The city name, which is Durham.\n- zipcode: Postal code for the location.\n- facilityid: Identifier possibly linked to a facility or area associated with the tree.\n- present: Type of feature present, such as a tree or a planting site.\n- genus: Genus of the tree.\n- species: Species of the tree.\n- commonname: Common name of the tree.\n- plantingda: Date or year range when the tree was planted or the planting site was established.\n- ...### Objectives\n\n1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.\n2. Apply Python libraries to uncover relationships between tree canopy data and urban development.\n3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.\n4. Produce analyses and visualizations with the GeoJSON file."
] |
a243c3211a94dda95212ca9b461af333f8332a9e | <b>The OpenBible Project</b>
This is a custom dataset (single column of text) of verses from the KJV, ASV, WLT and WEB. I'll be adding new Bible data soon, intended for LoRA training for Bible question answering.
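A minimal loading sketch (the "train" split and "text" column names are assumptions based on the single-text-column description above):

```python
from datasets import load_dataset

# Load the OpenBible corpus; split and column names are assumed.
bible = load_dataset("oliverbob/openbible", split="train")
print(bible[0]["text"])
```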
I have also taken the liberty of incorporating an open-source Bible trivia set from https://huggingface.co/datasets/liaaron1/bibile_trivia_alpaca and rearranging it to match my dataset.
I made multiple attempts at incorporating a few books of the Bible, but none of the models tested followed Biblical logic, so I experimented with a larger corpus of Bible data and biblical text in order to give the model more context.
I realize that almost every model these days fails to interact Biblically, so I have taken the initiative to give AI some scriptural logic to reason with humans on everyday Christian text.
This is a work in progress, and I'm committed to adding more features and data augmentation for the resulting model.
Created by: <b>Bob Reyes</b>
Creation date: February 14, 2024 | oliverbob/openbible | [
"license:apache-2.0",
"region:us"
] | 2024-02-13T18:56:51+00:00 | {"license": "apache-2.0"} | 2024-02-13T21:37:33+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| <b>The OpenBible Project</b>
This is a custom dataset (single column text) of verses KJV, ASV, WLT and WEB. I'll be adding new Bible data soon, written in LORA for Bible question answering.
I have also taken the liberty to incorporate an opensource Bible Trivia from URL and rearranged it to match my dataset.
I tried multiple attempts of incorporating few books of the Bible, but all models tested doesn't follow the Biblical logic, so I experimented on doing it with a larger corpus of Bible data and biblical text in order to give it more context.
I realize that almost every model these days fail to interact Biblically, so I have taken the initiative to give AI some scriptural logic to reason with humans, on everyday Christian text.
This is a work in progress and I'm committed to adding more features and data augmentation of the resulting model.
Created by: <b>Bob Reyes</b>
Creation date: February 14, 2024 | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] | [
14
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n"
] |
166f83fc54f723226241fcd83eb5c13d661e36de |
# Dataset Card for Evaluation run of DreadPoor/BagelToppyLake-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/BagelToppyLake-7B-slerp](https://huggingface.co/DreadPoor/BagelToppyLake-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__BagelToppyLake-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T19:00:17.803854](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__BagelToppyLake-7B-slerp/blob/main/results_2024-02-13T19-00-17.803854.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6450608281170044,
"acc_stderr": 0.03230282110443641,
"acc_norm": 0.64702543076818,
"acc_norm_stderr": 0.03296019853363821,
"mc1": 0.4541003671970624,
"mc1_stderr": 0.017429593091323522,
"mc2": 0.6215432793564798,
"mc2_stderr": 0.015396330957522903
},
"harness|arc:challenge|25": {
"acc": 0.6535836177474402,
"acc_stderr": 0.013905011180063228,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537302
},
"harness|hellaswag|10": {
"acc": 0.6711810396335391,
"acc_stderr": 0.004688239419302076,
"acc_norm": 0.8479386576379208,
"acc_norm_stderr": 0.0035834648107534598
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.02402225613030824,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.02402225613030824
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634335,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634335
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.012732398286190442,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.012732398286190442
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4541003671970624,
"mc1_stderr": 0.017429593091323522,
"mc2": 0.6215432793564798,
"mc2_stderr": 0.015396330957522903
},
"harness|winogrande|5": {
"acc": 0.8184688239936859,
"acc_stderr": 0.010833276515007482
},
"harness|gsm8k|5": {
"acc": 0.5504169825625473,
"acc_stderr": 0.013702290047884749
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DreadPoor__BagelToppyLake-7B-slerp | [
"region:us"
] | 2024-02-13T19:02:35+00:00 | {"pretty_name": "Evaluation run of DreadPoor/BagelToppyLake-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [DreadPoor/BagelToppyLake-7B-slerp](https://huggingface.co/DreadPoor/BagelToppyLake-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__BagelToppyLake-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T19:00:17.803854](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__BagelToppyLake-7B-slerp/blob/main/results_2024-02-13T19-00-17.803854.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6450608281170044,\n \"acc_stderr\": 0.03230282110443641,\n \"acc_norm\": 0.64702543076818,\n \"acc_norm_stderr\": 0.03296019853363821,\n \"mc1\": 0.4541003671970624,\n \"mc1_stderr\": 0.017429593091323522,\n \"mc2\": 0.6215432793564798,\n \"mc2_stderr\": 0.015396330957522903\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6535836177474402,\n \"acc_stderr\": 0.013905011180063228,\n \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537302\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6711810396335391,\n \"acc_stderr\": 0.004688239419302076,\n \"acc_norm\": 0.8479386576379208,\n \"acc_norm_stderr\": 0.0035834648107534598\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.02402225613030824,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.02402225613030824\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634335,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634335\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n 
\"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.012732398286190442,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.012732398286190442\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4541003671970624,\n \"mc1_stderr\": 0.017429593091323522,\n \"mc2\": 0.6215432793564798,\n \"mc2_stderr\": 0.015396330957522903\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.010833276515007482\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5504169825625473,\n \"acc_stderr\": 0.013702290047884749\n }\n}\n```", "repo_url": "https://huggingface.co/DreadPoor/BagelToppyLake-7B-slerp", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|arc:challenge|25_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|gsm8k|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hellaswag|10_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-00-17.803854.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-00-17.803854.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-00-17.803854.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T19-00-17.803854.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-00-17.803854.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-00-17.803854.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["**/details_harness|winogrande|5_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T19-00-17.803854.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T19_00_17.803854", "path": ["results_2024-02-13T19-00-17.803854.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T19-00-17.803854.parquet"]}]}]} | 2024-02-13T19:02:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DreadPoor/BagelToppyLake-7B-slerp
Dataset automatically created during the evaluation run of model DreadPoor/BagelToppyLake-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
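The original snippet was stripped from this plain-text copy of the card; below is a minimal sketch of the standard loading pattern, where the repository id is assumed to follow the leaderboard's usual `details_<org>__<model>` naming convention.

```python
from datasets import load_dataset

# Load the details for one task; the repo id below is assumed from the
# "open-llm-leaderboard/details_<org>__<model>" naming convention.
data = load_dataset("open-llm-leaderboard/details_DreadPoor__BagelToppyLake-7B-slerp",
	"harness_winogrande_5",
	split="train")
```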
## Latest results
These are the latest results from run 2024-02-13T19:00:17.803854 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
052b74667c3f7052eac7973e8d19519c501e7591 |
# Dataset Card for Evaluation run of andysalerno/rainbowfish-7B-v10
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/rainbowfish-7B-v10](https://huggingface.co/andysalerno/rainbowfish-7B-v10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v10",
"harness_winogrande_5",
split="train")
```
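Since the "results" configuration aggregates every task, the same pattern can be used to pull just the aggregated metrics. A small usage sketch (assuming the "results" config exposes the same `latest` split as the per-task configs, as the config list in this repo's metadata suggests):

```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v10",
	"results",
	split="latest")
```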
## Latest results
These are the [latest results from run 2024-02-13T19:38:34.787963](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v10/blob/main/results_2024-02-13T19-38-34.787963.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6298117973385691,
"acc_stderr": 0.032399345805451785,
"acc_norm": 0.6355056831439578,
"acc_norm_stderr": 0.03305466909667548,
"mc1": 0.3329253365973072,
"mc1_stderr": 0.016497402382012055,
"mc2": 0.49453359327829777,
"mc2_stderr": 0.015133317669005187
},
"harness|arc:challenge|25": {
"acc": 0.5767918088737202,
"acc_stderr": 0.014438036220848029,
"acc_norm": 0.6117747440273038,
"acc_norm_stderr": 0.014241614207414044
},
"harness|hellaswag|10": {
"acc": 0.6314479187412866,
"acc_stderr": 0.004814261966376849,
"acc_norm": 0.8233419637522406,
"acc_norm_stderr": 0.003805996119440377
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.037161774375660185,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.037161774375660185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.03692820767264866,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.03692820767264866
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105652,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105652
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.02423353229775873,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.02423353229775873
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.024283140529467305,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.024283140529467305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091105,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091105
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077812,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077812
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296417,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296417
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532337,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.015566392630057031,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.015566392630057031
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559806,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559806
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983576,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3329253365973072,
"mc1_stderr": 0.016497402382012055,
"mc2": 0.49453359327829777,
"mc2_stderr": 0.015133317669005187
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.36997725549658833,
"acc_stderr": 0.01329866120772713
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
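The structure is not documented on this card, but it can be inspected directly with the standard `datasets` API; a minimal sketch (the actual field names are whatever the parquet files contain, not something this card guarantees):

```python
from datasets import load_dataset

# Inspect one detail split's schema instead of relying on the card.
data = load_dataset("open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v10",
	"harness_winogrande_5",
	split="latest")
print(data.num_rows)      # number of evaluated examples
print(data.column_names)  # columns stored in the parquet files
print(data.features)      # full feature types
```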
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v10 | [
"region:us"
] | 2024-02-13T19:40:53+00:00 | {"pretty_name": "Evaluation run of andysalerno/rainbowfish-7B-v10", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/rainbowfish-7B-v10](https://huggingface.co/andysalerno/rainbowfish-7B-v10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v10\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T19:38:34.787963](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v10/blob/main/results_2024-02-13T19-38-34.787963.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6298117973385691,\n \"acc_stderr\": 0.032399345805451785,\n \"acc_norm\": 0.6355056831439578,\n \"acc_norm_stderr\": 0.03305466909667548,\n \"mc1\": 0.3329253365973072,\n \"mc1_stderr\": 0.016497402382012055,\n \"mc2\": 0.49453359327829777,\n \"mc2_stderr\": 0.015133317669005187\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848029,\n \"acc_norm\": 0.6117747440273038,\n \"acc_norm_stderr\": 0.014241614207414044\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6314479187412866,\n \"acc_stderr\": 0.004814261966376849,\n \"acc_norm\": 0.8233419637522406,\n \"acc_norm_stderr\": 0.003805996119440377\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.037161774375660185,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.037161774375660185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.03692820767264866,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.03692820767264866\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105652,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105652\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.024283140529467305,\n \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.024283140529467305\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091105,\n \"acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091105\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8007662835249042,\n \"acc_stderr\": 0.014283378044296417,\n \"acc_norm\": 0.8007662835249042,\n \"acc_norm_stderr\": 0.014283378044296417\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532337,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532337\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n \"acc_stderr\": 0.015566392630057031,\n \"acc_norm\": 0.31731843575418994,\n \"acc_norm_stderr\": 0.015566392630057031\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n \"acc_stderr\": 0.012702317490559806,\n \"acc_norm\": 0.4485006518904824,\n \"acc_norm_stderr\": 0.012702317490559806\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983576,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983576\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825365,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825365\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3329253365973072,\n \"mc1_stderr\": 0.016497402382012055,\n \"mc2\": 0.49453359327829777,\n \"mc2_stderr\": 0.015133317669005187\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36997725549658833,\n \"acc_stderr\": 
0.01329866120772713\n }\n}\n```", "repo_url": "https://huggingface.co/andysalerno/rainbowfish-7B-v10", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|arc:challenge|25_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|gsm8k|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hellaswag|10_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-38-34.787963.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-38-34.787963.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-38-34.787963.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T19-38-34.787963.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-38-34.787963.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T19_38_34.787963", "path": ["**/details_harness|winogrande|5_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T19-38-34.787963.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_13T19_38_34.787963", "path": ["results_2024-02-13T19-38-34.787963.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T19-38-34.787963.parquet"]}]}]} | 2024-02-13T19:41:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of andysalerno/rainbowfish-7B-v10
Dataset automatically created during the evaluation run of model andysalerno/rainbowfish-7B-v10 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
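A minimal sketch, assuming the details repository for this run follows the leaderboard's usual `details_<org>__<model>` naming pattern seen elsewhere in these cards:

```python
from datasets import load_dataset

# Load one task configuration; per the card, the "train" split
# points to the latest results for that task.
data = load_dataset("open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v10",
	"harness_winogrande_5",
	split="train")
```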
## Latest results
These are the latest results from run 2024-02-13T19:38:34.787963 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of andysalerno/rainbowfish-7B-v10\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/rainbowfish-7B-v10 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T19:38:34.787963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of andysalerno/rainbowfish-7B-v10\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/rainbowfish-7B-v10 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T19:38:34.787963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of andysalerno/rainbowfish-7B-v10\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/rainbowfish-7B-v10 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T19:38:34.787963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
fa4aebd17c54c133ec17ea4a3b5219f19a6f2f0a | # Dataset Card for "samantar_merged_with_train_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mlsquare/samantar_merged_with_train_val | [
"region:us"
] | 2024-02-13T19:45:03+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "tgt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10796526292, "num_examples": 79638793}, {"name": "valid", "num_bytes": 2486522033, "num_examples": 19909699}], "download_size": 7507176021, "dataset_size": 13283048325}} | 2024-02-13T19:51:39+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "samantar_merged_with_train_val"
More Information needed | [
"# Dataset Card for \"samantar_merged_with_train_val\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"samantar_merged_with_train_val\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"samantar_merged_with_train_val\"\n\nMore Information needed"
] |
9ff46324fbee403da9d4df28f6ab7f29574f93a1 |
# Dataset Card for Evaluation run of FelixChao/Cygnus-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/Cygnus-7B](https://huggingface.co/FelixChao/Cygnus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__Cygnus-7B",
"harness_winogrande_5",
split="train")
```
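To discover which task configurations and timestamped splits are available without downloading any data, the standard `datasets` helpers can be used (a sketch; the repository name is taken from the example above):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

# List all 63 task configurations in this details repository.
configs = get_dataset_config_names("open-llm-leaderboard/details_FelixChao__Cygnus-7B")

# List the splits (timestamped runs plus "latest") for one configuration.
splits = get_dataset_split_names("open-llm-leaderboard/details_FelixChao__Cygnus-7B",
                                 "harness_winogrande_5")
print(configs[:5], splits)
```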
## Latest results
These are the [latest results from run 2024-02-13T19:53:00.035091](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Cygnus-7B/blob/main/results_2024-02-13T19-53-00.035091.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6442703298905322,
"acc_stderr": 0.03219446775786222,
"acc_norm": 0.6433690342269983,
"acc_norm_stderr": 0.032867952417734415,
"mc1": 0.5556915544675642,
"mc1_stderr": 0.017394586250743176,
"mc2": 0.7261410824293602,
"mc2_stderr": 0.014546522781456774
},
"harness|arc:challenge|25": {
"acc": 0.6860068259385665,
"acc_stderr": 0.013562691224726297,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907597
},
"harness|hellaswag|10": {
"acc": 0.6920932085241984,
"acc_stderr": 0.004606843344517466,
"acc_norm": 0.8782115116510655,
"acc_norm_stderr": 0.003263729817698783
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.024283140529467305,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.024283140529467305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233483,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233483
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4022346368715084,
"acc_stderr": 0.01639971673284714,
"acc_norm": 0.4022346368715084,
"acc_norm_stderr": 0.01639971673284714
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532069,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5556915544675642,
"mc1_stderr": 0.017394586250743176,
"mc2": 0.7261410824293602,
"mc2_stderr": 0.014546522781456774
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613981
},
"harness|gsm8k|5": {
"acc": 0.731614859742229,
"acc_stderr": 0.012205702688013667
}
}
```
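These aggregated numbers can also be fetched programmatically via the "results" configuration mentioned above (a sketch, assuming that configuration exposes the same "latest" split as the per-task configurations):

```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics for each run.
results = load_dataset("open-llm-leaderboard/details_FelixChao__Cygnus-7B",
                       "results",
                       split="latest")
print(results[0])  # one row of aggregated scores for the latest run
```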
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__Cygnus-7B | [
"region:us"
] | 2024-02-13T19:55:19+00:00 | {"pretty_name": "Evaluation run of FelixChao/Cygnus-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/Cygnus-7B](https://huggingface.co/FelixChao/Cygnus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Cygnus-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T19:53:00.035091](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Cygnus-7B/blob/main/results_2024-02-13T19-53-00.035091.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6442703298905322,\n \"acc_stderr\": 0.03219446775786222,\n \"acc_norm\": 0.6433690342269983,\n \"acc_norm_stderr\": 0.032867952417734415,\n \"mc1\": 0.5556915544675642,\n \"mc1_stderr\": 0.017394586250743176,\n \"mc2\": 0.7261410824293602,\n \"mc2_stderr\": 0.014546522781456774\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726297,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907597\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6920932085241984,\n \"acc_stderr\": 0.004606843344517466,\n \"acc_norm\": 0.8782115116510655,\n \"acc_norm_stderr\": 0.003263729817698783\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6435897435897436,\n \"acc_stderr\": 
0.024283140529467305,\n \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.024283140529467305\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233483,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233483\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n 
\"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4022346368715084,\n \"acc_stderr\": 0.01639971673284714,\n \"acc_norm\": 0.4022346368715084,\n \"acc_norm_stderr\": 0.01639971673284714\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n \"acc_stderr\": 0.012734923579532069,\n \"acc_norm\": 0.46284224250325945,\n \"acc_norm_stderr\": 0.012734923579532069\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5556915544675642,\n \"mc1_stderr\": 0.017394586250743176,\n \"mc2\": 0.7261410824293602,\n \"mc2_stderr\": 0.014546522781456774\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613981\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.731614859742229,\n \"acc_stderr\": 0.012205702688013667\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/Cygnus-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|arc:challenge|25_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|gsm8k|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hellaswag|10_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-53-00.035091.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-53-00.035091.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-53-00.035091.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T19-53-00.035091.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-53-00.035091.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T19-53-00.035091.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["**/details_harness|winogrande|5_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T19-53-00.035091.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T19_53_00.035091", "path": ["results_2024-02-13T19-53-00.035091.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T19-53-00.035091.parquet"]}]}]} | 2024-02-13T19:55:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FelixChao/Cygnus-7B
Dataset automatically created during the evaluation run of model FelixChao/Cygnus-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
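A minimal loading sketch, mirroring the snippet used in the other cards in this collection; the repository name is inferred from the leaderboard's `details_<org>__<model>` naming convention, so treat it as an assumption:

```python
from datasets import load_dataset

# Repo name inferred from the leaderboard's details-repo naming convention (assumption).
data = load_dataset("open-llm-leaderboard/details_FelixChao__Cygnus-7B",
	"harness_winogrande_5",
	split="train")
```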
## Latest results
These are the latest results from run 2024-02-13T19:53:00.035091 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FelixChao/Cygnus-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Cygnus-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T19:53:00.035091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FelixChao/Cygnus-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Cygnus-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T19:53:00.035091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
179,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FelixChao/Cygnus-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Cygnus-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T19:53:00.035091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
c6777150c07821fb54d36eac4496f001d4957b53 |
Dataset for training an entailment classifier to recognize approval/disapproval of politicians.
Documents are Tweets from Kawintiranon (2022), the MTSD dataset, as well as Tweets and sentences taken from weekly newsletters for select politicians from the 115th, 116th, and 117th Congresses.
Documents are triple coded -- once from the original compilers of the dataset, once from GPT-4, and a third time to adjudicate discrepancies between the two.
Twitter handles of politicians in the dataset have been replaced with their names. Be aware that "rt @realdonaldtrump You're a liar!" has been replaced with "rt trump You're a liar!",
which means the author is retweeting Trump calling someone else a liar, not the author calling Trump a liar.
Stance:
-1: Against: The document is critical of the target.
0: Neutral: The document doesn't express an opinion about the target or it can't be determined what the expressed opinion is with the given context.
1: Support: The document expresses support for the target. Expressing collaboration on bills or letters is considered support.
Label:
0: Entail
1: Not Entail
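A minimal sketch of loading the dataset and filtering on these labels, assuming the default configuration and the column names (`premise`, `target`, `hypothesis`, `stance`, `label`) listed in this card's metadata:

```python
from datasets import load_dataset

# Load the default configuration (train/validation/test splits).
ds = load_dataset("mlburnham/PoliStance_Affect")
train = ds["train"]

# Keep only documents coded as supportive of their target (stance == 1).
supportive = train.filter(lambda row: row["stance"] == 1)
print(f"{len(supportive)} supportive documents out of {len(train)}")
```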
In addition to 1,000 documents about targets in the training set, the test set contains documents about 6 politicians not included in the training or validation data:
- Ted Cruz
- Hakeem Jeffries
- Madison Cawthorn
- Alexandria Ocasio-Cortez
- Mitt Romney
- Kyrsten Sinema | mlburnham/PoliStance_Affect | [
"task_categories:zero-shot-classification",
"license:mit",
"region:us"
] | 2024-02-13T20:26:41+00:00 | {"license": "mit", "task_categories": ["zero-shot-classification"], "pretty name": "PoliStance Affect", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "premise", "dtype": "string"}, {"name": "target", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "stance", "dtype": "int32"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 5392570, "num_examples": 17164}, {"name": "validation", "num_bytes": 1327661, "num_examples": 4291}, {"name": "test", "num_bytes": 1633230, "num_examples": 5383}], "download_size": 4211025, "dataset_size": 8353461}} | 2024-02-13T21:38:58+00:00 | [] | [] | TAGS
#task_categories-zero-shot-classification #license-mit #region-us
|
Dataset for training an entailment classifier to recognize approval/disapproval of politicians.
Documents are Tweets from Kawintiranon (2022), the MTSD dataset, as well as Tweets and sentences taken from weekly newsletters for select politicians from the 115th, 116th, and 117th Congresses.
Documents are triple coded -- once from the original compilers of the dataset, once from GPT-4, and a third time to adjudicate discrepancies between the two.
Twitter handles of politicians in the dataset have been replaced with their names. Be aware that "rt @realdonaldtrump You're a liar!" has been replaced with "rt trump You're a liar!",
which means the author is retweeting Trump calling someone else a liar, not the author calling Trump a liar.
Stance:
-1: Against: The document is critical of the target.
0: Neutral: The document doesn't express an opinion about the target or it can't be determined what the expressed opinion is with the given context.
1: Support: The document expresses support for the target. Expressing collaboration on bills or letters is considered support.
Label:
0: Entail
1: Not Entail
In addition to 1,000 documents about targets in the training set, the test set contains documents about 6 politicians not included in the training or validation data:
- Ted Cruz
- Hakeem Jeffries
- Madison Cawthorn
- Alexandria Ocasio-Cortez
- Mitt Romney
- Kyrsten Sinema | [] | [
"TAGS\n#task_categories-zero-shot-classification #license-mit #region-us \n"
] | [
24
] | [
"passage: TAGS\n#task_categories-zero-shot-classification #license-mit #region-us \n"
] |
63781379fbe5f823649d19dfdb9632541e0cb275 |
This dataset is similar to [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs) with slightly fewer entries and with the GPT-3.5 answers replaced by GPT-4 Turbo answers. | yleo/emerton_dpo_pairs | [
"region:us"
] | 2024-02-13T20:30:31+00:00 | {"dataset_info": {"features": [{"name": "system", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 15630325.68343667, "num_examples": 5489}], "download_size": 9101980, "dataset_size": 15630325.68343667}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T12:13:07+00:00 | [] | [] | TAGS
#region-us
|
This dataset is similar to Intel/orca_dpo_pairs with slightly fewer entries and with the GPT-3.5 answers replaced by GPT-4 Turbo answers. | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
51fc74f352634ef37f12d279bc7ac41fd2f8882f |
# Dataset Card for Evaluation run of Technoculture/mtor
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/mtor](https://huggingface.co/Technoculture/mtor) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__mtor",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T20:28:47.252845](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__mtor/blob/main/results_2024-02-13T20-28-47.252845.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.243036791690957,
"acc_stderr": 0.030443634604007775,
"acc_norm": 0.2436435956890849,
"acc_norm_stderr": 0.0312521901068886,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.4968235925779197,
"mc2_stderr": 0.01657342542218468
},
"harness|arc:challenge|25": {
"acc": 0.2235494880546075,
"acc_stderr": 0.012174896631202607,
"acc_norm": 0.27303754266211605,
"acc_norm_stderr": 0.013019332762635739
},
"harness|hellaswag|10": {
"acc": 0.25761800438159727,
"acc_stderr": 0.004364287353415455,
"acc_norm": 0.2621987651862179,
"acc_norm_stderr": 0.00438931274801215
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.0301675334686327,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.0301675334686327
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.026055296901152915,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.026055296901152915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.029605623981771204,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.029605623981771204
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.020842290930114683,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.020842290930114683
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928724,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928724
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.029225575892489614,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.029225575892489614
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817227,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817227
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671549,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671549
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1834862385321101,
"acc_stderr": 0.01659525971039932,
"acc_norm": 0.1834862385321101,
"acc_norm_stderr": 0.01659525971039932
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18981481481481483,
"acc_stderr": 0.026744714834691933,
"acc_norm": 0.18981481481481483,
"acc_norm_stderr": 0.026744714834691933
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891155,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891155
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.280970625798212,
"acc_stderr": 0.016073127851221246,
"acc_norm": 0.280970625798212,
"acc_norm_stderr": 0.016073127851221246
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676644,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676644
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2426470588235294,
"acc_stderr": 0.02604066247420125,
"acc_norm": 0.2426470588235294,
"acc_norm_stderr": 0.02604066247420125
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2434640522875817,
"acc_stderr": 0.017362473762146637,
"acc_norm": 0.2434640522875817,
"acc_norm_stderr": 0.017362473762146637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017183,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017183
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.4968235925779197,
"mc2_stderr": 0.01657342542218468
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.0140519560640769
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Technoculture__mtor | [
"region:us"
] | 2024-02-13T20:31:07+00:00 | {"pretty_name": "Evaluation run of Technoculture/mtor", "dataset_summary": "Dataset automatically created during the evaluation run of model [Technoculture/mtor](https://huggingface.co/Technoculture/mtor) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__mtor\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T20:28:47.252845](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__mtor/blob/main/results_2024-02-13T20-28-47.252845.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.243036791690957,\n \"acc_stderr\": 0.030443634604007775,\n \"acc_norm\": 0.2436435956890849,\n \"acc_norm_stderr\": 0.0312521901068886,\n \"mc1\": 0.2423500611995104,\n \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.4968235925779197,\n \"mc2_stderr\": 0.01657342542218468\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202607,\n \"acc_norm\": 0.27303754266211605,\n \"acc_norm_stderr\": 0.013019332762635739\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25761800438159727,\n \"acc_stderr\": 0.004364287353415455,\n \"acc_norm\": 0.2621987651862179,\n \"acc_norm_stderr\": 0.00438931274801215\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.0301675334686327,\n \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.0301675334686327\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.026055296901152915,\n \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.026055296901152915\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n 
\"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n \"acc_stderr\": 0.029605623981771204,\n \"acc_norm\": 0.18497109826589594,\n \"acc_norm_stderr\": 0.029605623981771204\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.020842290930114683,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.020842290930114683\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.039325376803928724,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.039325376803928724\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.23548387096774193,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.029225575892489614,\n \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.029225575892489614\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817227,\n \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817227\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671549,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671549\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1834862385321101,\n \"acc_stderr\": 0.01659525971039932,\n \"acc_norm\": 0.1834862385321101,\n \"acc_norm_stderr\": 0.01659525971039932\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.18981481481481483,\n \"acc_stderr\": 0.026744714834691933,\n \"acc_norm\": 0.18981481481481483,\n \"acc_norm_stderr\": 0.026744714834691933\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083291,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083291\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.37668161434977576,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.029872577708891155,\n \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.029872577708891155\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.280970625798212,\n \"acc_stderr\": 0.016073127851221246,\n 
\"acc_norm\": 0.280970625798212,\n \"acc_norm_stderr\": 0.016073127851221246\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912255,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912255\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713002,\n \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.010896123652676644,\n \"acc_norm\": 0.2392438070404172,\n \"acc_norm_stderr\": 0.010896123652676644\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.02604066247420125,\n \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.02604066247420125\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146637,\n \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146637\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n \"acc_stderr\": 0.029475250236017183,\n \"acc_norm\": 0.22388059701492538,\n \"acc_norm_stderr\": 0.029475250236017183\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.4968235925779197,\n \"mc2_stderr\": 0.01657342542218468\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.0140519560640769\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Technoculture/mtor", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|arc:challenge|25_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|gsm8k|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hellaswag|10_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T20-28-47.252845.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T20-28-47.252845.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T20-28-47.252845.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T20-28-47.252845.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T20-28-47.252845.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T20-28-47.252845.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["**/details_harness|winogrande|5_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T20-28-47.252845.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T20_28_47.252845", "path": ["results_2024-02-13T20-28-47.252845.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T20-28-47.252845.parquet"]}]}]} | 2024-02-13T20:31:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Technoculture/mtor
Dataset automatically created during the evaluation run of model Technoculture/mtor on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
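```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__mtor",
	"harness_winogrande_5",
	split="train")
```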
## Latest results
These are the latest results from run 2024-02-13T20:28:47.252845 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
7cf4b134273171a5979d593c7f90fe608e480add | # Dataset Card for "ShareMix-chatML"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | bcui19/ShareMix-chatML | [
"region:us"
] | 2024-02-13T20:47:26+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 152677714, "num_examples": 58483}], "download_size": 79748431, "dataset_size": 152677714}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-13T20:55:31+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ShareMix-chatML"
More Information needed
2b287cda771723689af594fef64251c1ecd4b52a |
# Dataset Card for the LoWRA Bench Dataset
The ***Lo***RA ***W***eight ***R***ecovery ***A***ttack (LoWRA) Bench is a comprehensive
benchmark designed to evaluate Pre-Fine-Tuning (Pre-FT) weight recovery methods as presented
in the "Recovering the Pre-Fine-Tuning Weights of Generative Models" paper.
- [Task Details](#task-details)
- [Dataset Description](#dataset-description)
- [Dataset Structure](#dataset-structure)
- [Data Subsets](#data-subsets)
- [Data Fields](#data-fields)
- [Layer Merging Example](#layer-merging-example)
- [Dataset Creation](#dataset-creation)
- [Risks and Out-of-Scope Use](#risks-and-out-of-scope-use)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- **🌐 Homepage:**
https://vision.huji.ac.il/spectral_detuning/
- **🧑💻 Repository:**
https://github.com/eliahuhorwitz/Spectral-DeTuning
- **📃 Paper:**
https://arxiv.org/abs/2402.10208
- **✉️ Point of Contact:**
[email protected]
## Task Details
**Pre-Fine-Tuning Weight Recovery Attack Setting:** We uncover a vulnerability in LoRA fine-tuned models wherein an attacker is
able to undo the fine-tuning process and recover the weights of the original pre-trained model.
The setting for the vulnerability is as follows:
- (a) The attacker only has access to n different LoRA fine-tuned models.
- (b) The attacker assumes that all n models originated from the same source model.
- (c) Using only the n visible models, the attacker attempts to recover the original source model.
**Note: The attacker has no access to the low-rank decomposition of the fine-tuned models.**
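To make the setting concrete, the sketch below illustrates the style of attack the paper evaluates: given only the n merged weight matrices `W_i = W* + B_i A_i` of a single layer, it alternates between estimating each low-rank LoRA residual and re-estimating the shared Pre-FT matrix `W*`. This is a simplified illustration in the spirit of the paper's Spectral DeTuning method, not the reference implementation (which adds refinements such as a rank scheduler); the function name and defaults here are ours.

```python
import torch

def recover_pre_ft_weight(merged_weights, rank, n_iters=300):
    """Toy Pre-FT recovery for one layer from n merged matrices W_i = W* + dW_i, rank(dW_i) <= rank."""
    W_star = torch.stack(merged_weights).mean(dim=0)  # initialize with the mean of the visible models
    for _ in range(n_iters):
        estimates = []
        for W_i in merged_weights:
            # Best rank-r approximation of the current residual = estimated LoRA update of model i
            U, S, Vh = torch.linalg.svd(W_i - W_star, full_matrices=False)
            dW_i = (U[:, :rank] * S[:rank]) @ Vh[:rank]
            estimates.append(W_i - dW_i)  # peel off the estimated update
        W_star = torch.stack(estimates).mean(dim=0)  # re-estimate the shared Pre-FT weight
    return W_star
```

In the benchmark, the quality of such a recovery can then be measured numerically against the ground-truth `pre_ft_weight` stored in each row.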
## Dataset Description
The LoWRA Bench dataset is designed to evaluate the performance of Pre-FT weight recovery methods.
The dataset encompasses three pre-trained representative source models:
1. A Vision Transformer (ViT) pre-trained on ImageNet-1K.
2. Mistral-7B-v0.1.
3. Stable Diffusion 1.5.
These models collectively cover supervised and self-supervised objectives, spanning both vision and
natural language processing (NLP) domains, as well as generative and discriminative tasks.
Notably, these models are widely used and deployed in numerous production systems.
For each source model, we curate 15 LoRA models fine-tuned on diverse datasets, tasks, and objectives.
The dataset comprises a diverse array of layer types, including self-attention, cross-attention,
and MLPs. This diversity enables us to assess the generalization capabilities of Pre-FT methods.
The evaluation can be conducted on a per-model basis, per layer type, or per layer depth,
allowing for a comprehensive analysis of Pre-FT methods. Overall, our dataset includes 544 source
model layers. When taking into account the fine-tuned LoRA layers, the dataset includes over
8,000 layers.
## Dataset Structure
The dataset contains 4 subsets; for each subset we curate 15 LoRA fine-tuned models.
Each row of the dataset represents a single layer that should be recovered and contains all the needed information for the recovery and numerical evaluation.
In particular, for each layer, the dataset includes the original Pre-FT weights and the *unmerged* fine-tuned LoRA weight matrices.
We decided to provide the unmerged weights instead of the merged ones for two reasons:
1. Providing the unmerged weights significantly reduces the storage size of the dataset (e.g., for a single Mistral subset this reduces the size from ~100GB to ~8GB).
2. Providing the unmerged weights allows the dataset user to study the properties of the fine-tuned LoRA layers and may help when developing new methods.
We leave the merging of the layers to the user; keep in mind that this should be done carefully and tested to ensure the original Pre-FT weights are not simply
provided to the method verbatim. See [Layer Merging Example](#layer-merging-example) for an example taken from our GitHub repository.
### Data Subsets
The table below describes the dataset subsets in detail:
| Subset Name | Pre-FT Model | Task | Fine-tuning Task | # Pre-FT Layers | # Fine-tuned Layers |
|----------------------|----------------------|-------------------------------|------------------|-----------------|---------------------|
| vit | ViT | Image Classification | VTAB-1K | 24 | 360 |
| stable-diffusion-1.5 | Stable Diffusion 1.5 | Text-to-Image <br/>Generation | Personalization | 264 | 3960 |
| mistral-7b-v0.1-sft | Mistral-7B-v0.1 | Text Generation | UltraChat SFT | 128 | 1920 |
| mistral-7b-v0.1-dpo | Mistral-7B-v0.1 | Text Generation | UltraFeedback DPO| 128 | 1920 |
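The fine-tuned layer counts follow directly from the 15 LoRA models per subset, i.e., 15 × (# Pre-FT layers): for example, 15 × 264 = 3,960 for Stable Diffusion 1.5, giving 8,160 fine-tuned layers in total across all subsets.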
### Data Fields
As described above, each row of the dataset represents a single layer that should be recovered and contains the following fields:
- `task_name` - The name of the task the model was fine-tuned on (subset).
- `layer_model` - In some cases a Pre-FT model consists of more than one model (e.g., Stable Diffusion fine-tunes both
the UNet and the Text Encoder). This field specifies the model the layer belongs to.
- `layer_name` - The name of the layer in the Pre-FT model as it appears in the model `state_dict`.
- `pre_ft_name` - The name of the Pre-FT model (e.g., runwayml/stable-diffusion-v1-5).
- `pre_ft_weight` - The weight matrix of the Pre-FT model's layer.
- `lora_{lora_idx}_name` - The name of the LoRA fine-tuned model.
- `lora_{lora_idx}_A_weight` - The LoRA A weight matrix of the LoRA fine-tuned model's layer.
- `lora_{lora_idx}_B_weight` - The LoRA B weight matrix of the LoRA fine-tuned model's layer.
- `lora_{lora_idx}_rank` - The LoRA rank of the LoRA fine-tuned model's layer.
- `lora_{lora_idx}_alpha` - The LoRA alpha of the LoRA fine-tuned model's layer.
where `{lora_idx}` is the index of the LoRA fine-tuned model in the subset (there are 15 LoRA models per subset).
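As a quick sanity check, the fields above can be inspected right after loading a subset. The sketch below uses the `vit` subset (the smallest one, with 24 Pre-FT layers); the field names follow the list above:
```python
from datasets import load_dataset

# Load the ViT subset and print a few fields of the first layer row.
ds = load_dataset("Eliahu/LoWRA-Bench", name="vit", split="train")
row = ds[0]
print(row["task_name"], row["layer_model"], row["layer_name"])
print(row["pre_ft_name"], row["lora_0_rank"], row["lora_0_alpha"])
```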
### Layer Merging Example
The following code snippet demonstrates merging the LoRA fine-tuned weights with the Pre-FT weights.
```python
from copy import deepcopy

import torch
from datasets import load_dataset


def merge_lora_weights(args, layer_idx, device):
dataset = load_dataset(args.dataset, name=args.subset, cache_dir=args.cache_dir)
layer = deepcopy(dataset.with_format("torch")["train"][layer_idx])
merged_layer = {}
    # Note: copy the layer metadata and load the ground truth Pre-FT weights
merged_layer['layer_model'] = layer['layer_model']
merged_layer['layer_name'] = layer['layer_name']
merged_layer['pre_ft_name'] = layer['pre_ft_name']
W_pre_ft = deepcopy(layer['pre_ft_weight']).to(device).float()
merged_layer['pre_ft_weight'] = deepcopy(W_pre_ft)
# Note: merge the LoRA weights for all existing LoRA models
for lora_idx in args.lora_ids:
alpha = layer[f'lora_{lora_idx}_alpha']
rank = layer[f'lora_{lora_idx}_rank']
B = deepcopy(layer[f'lora_{lora_idx}_B_weight']).to(device).float()
A = deepcopy(layer[f'lora_{lora_idx}_A_weight']).to(device).float()
merged_layer[f'lora_{lora_idx}_name'] = layer[f'lora_{lora_idx}_name']
merged_layer[f'lora_{lora_idx}_rank'] = rank
merged_layer[f'lora_{lora_idx}_alpha'] = alpha
merged_layer[f'lora_{lora_idx}_merged_weights'] = W_pre_ft + ((alpha / rank * B) @ A)
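        # Sanity checks: the merged weights must differ from the untouched Pre-FT
        # weights, i.e., the Pre-FT weights are never handed to a recovery method verbatim.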
assert torch.allclose(merged_layer['pre_ft_weight'], layer['pre_ft_weight'])
assert not torch.allclose(merged_layer[f'lora_{lora_idx}_merged_weights'], layer['pre_ft_weight'])
assert not torch.allclose(merged_layer[f'lora_{lora_idx}_merged_weights'], merged_layer['pre_ft_weight'])
return merged_layer
```
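For instance, the function above could be invoked as follows (a hypothetical sketch: the `args` namespace simply mirrors the attributes the snippet reads, and `lora_ids` enumerates the 15 LoRA models per subset):
```python
from types import SimpleNamespace

# Hypothetical invocation; adjust subset/cache_dir to your setup.
args = SimpleNamespace(
    dataset="Eliahu/LoWRA-Bench",
    subset="vit",
    cache_dir=None,
    lora_ids=list(range(15)),
)
merged = merge_lora_weights(args, layer_idx=0, device="cpu")
print(merged["layer_name"], merged["lora_0_merged_weights"].shape)
```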
## Dataset Creation
### Source Data
- The fine-tuning of the ViT models was performed using the [PEFT](https://huggingface.co/docs/peft/en/index) library
on various datasets from the [VTAB-1K](https://google-research.github.io/task_adaptation/) benchmark.
- The fine-tuned LoRA models for Stable Diffusion are taken from civitai and were fine-tuned by [RalFinger](https://civitai.com/user/RalFinger).
- The fine-tuning of Mistral was performed based on the Zephyr model as seen [here](https://github.com/huggingface/alignment-handbook/tree/main).
For the full list of models and hyper-parameters see the appendix of the [paper](https://arxiv.org/abs/2402.10208).
## Risks and Out-of-Scope Use
Our work uncovers a significant vulnerability in fine-tuned models, allowing attackers to
access pre-fine-tuning weights. While this discovery reveals potential security risks,
our primary objective is to advance the field of Machine Learning and raise awareness within the
research community about the existing vulnerabilities in current models.
Instead of using the findings of this study to execute attacks, we advocate for their use by
model creators to enhance the safety and security of their models. By acknowledging and
addressing vulnerabilities, creators can proactively safeguard against potential threats.
Following established practices in the cyber-security community, we emphasize the importance of open
discussion and encourage the reporting of vulnerabilities. By fostering transparency and collaboration,
we can collectively create a safer environment for deploying machine learning models.
## Considerations for Using the Data
### Licensing Information
[More Information Needed]
### Citation Information
If you use this dataset in your work, please cite the following paper:
**BibTeX:**
[More Information Needed]
| Eliahu/LoWRA-Bench | [
"arxiv:2402.10208",
"region:us"
] | 2024-02-13T21:03:01+00:00 | {"pretty_name": "LoWRA-Bench", "dataset_info": [{"config_name": "mistral-7b-v0.1-dpo", "features": [{"name": "task_name", "dtype": "string"}, {"name": "layer_model", "dtype": "string"}, {"name": "layer_name", "dtype": "string"}, {"name": "pre_ft_name", "dtype": "string"}, {"name": "pre_ft_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_name", "dtype": "string"}, {"name": "lora_0_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_rank", "dtype": "int64"}, {"name": "lora_0_alpha", "dtype": "int64"}, {"name": "lora_1_name", "dtype": "string"}, {"name": "lora_1_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_1_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_1_rank", "dtype": "int64"}, {"name": "lora_1_alpha", "dtype": "int64"}, {"name": "lora_2_name", "dtype": "string"}, {"name": "lora_2_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_2_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_2_rank", "dtype": "int64"}, {"name": "lora_2_alpha", "dtype": "int64"}, {"name": "lora_3_name", "dtype": "string"}, {"name": "lora_3_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_3_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_3_rank", "dtype": "int64"}, {"name": "lora_3_alpha", "dtype": "int64"}, {"name": "lora_4_name", "dtype": "string"}, {"name": "lora_4_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_4_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_4_rank", "dtype": "int64"}, {"name": "lora_4_alpha", "dtype": "int64"}, {"name": "lora_5_name", "dtype": "string"}, {"name": "lora_5_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_5_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_5_rank", "dtype": "int64"}, {"name": "lora_5_alpha", "dtype": "int64"}, {"name": "lora_6_name", "dtype": "string"}, {"name": "lora_6_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_6_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_6_rank", "dtype": "int64"}, {"name": "lora_6_alpha", "dtype": "int64"}, {"name": "lora_7_name", "dtype": "string"}, {"name": "lora_7_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_7_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_7_rank", "dtype": "int64"}, {"name": "lora_7_alpha", "dtype": "int64"}, {"name": "lora_8_name", "dtype": "string"}, {"name": "lora_8_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_8_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_8_rank", "dtype": "int64"}, {"name": "lora_8_alpha", "dtype": "int64"}, {"name": "lora_9_name", "dtype": "string"}, {"name": "lora_9_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_9_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_9_rank", "dtype": "int64"}, {"name": "lora_9_alpha", "dtype": "int64"}, {"name": "lora_10_name", "dtype": "string"}, {"name": "lora_10_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_10_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_10_rank", "dtype": "int64"}, {"name": "lora_10_alpha", "dtype": "int64"}, {"name": "lora_11_name", "dtype": "string"}, {"name": "lora_11_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_11_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_11_rank", "dtype": "int64"}, {"name": "lora_11_alpha", "dtype": "int64"}, {"name": 
"lora_12_name", "dtype": "string"}, {"name": "lora_12_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_12_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_12_rank", "dtype": "int64"}, {"name": "lora_12_alpha", "dtype": "int64"}, {"name": "lora_13_name", "dtype": "string"}, {"name": "lora_13_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_13_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_13_rank", "dtype": "int64"}, {"name": "lora_13_alpha", "dtype": "int64"}, {"name": "lora_14_name", "dtype": "string"}, {"name": "lora_14_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_14_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_14_rank", "dtype": "int64"}, {"name": "lora_14_alpha", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 8661875544, "num_examples": 128}], "download_size": 3419054382, "dataset_size": 8661875544}, {"config_name": "mistral-7b-v0.1-sft", "features": [{"name": "task_name", "dtype": "string"}, {"name": "layer_model", "dtype": "string"}, {"name": "layer_name", "dtype": "string"}, {"name": "pre_ft_name", "dtype": "string"}, {"name": "pre_ft_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_name", "dtype": "string"}, {"name": "lora_0_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_rank", "dtype": "int64"}, {"name": "lora_0_alpha", "dtype": "int64"}, {"name": "lora_1_name", "dtype": "string"}, {"name": "lora_1_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_1_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_1_rank", "dtype": "int64"}, {"name": "lora_1_alpha", "dtype": "int64"}, {"name": "lora_2_name", "dtype": "string"}, {"name": "lora_2_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_2_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_2_rank", "dtype": "int64"}, {"name": "lora_2_alpha", "dtype": "int64"}, {"name": "lora_3_name", "dtype": "string"}, {"name": "lora_3_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_3_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_3_rank", "dtype": "int64"}, {"name": "lora_3_alpha", "dtype": "int64"}, {"name": "lora_4_name", "dtype": "string"}, {"name": "lora_4_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_4_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_4_rank", "dtype": "int64"}, {"name": "lora_4_alpha", "dtype": "int64"}, {"name": "lora_5_name", "dtype": "string"}, {"name": "lora_5_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_5_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_5_rank", "dtype": "int64"}, {"name": "lora_5_alpha", "dtype": "int64"}, {"name": "lora_6_name", "dtype": "string"}, {"name": "lora_6_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_6_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_6_rank", "dtype": "int64"}, {"name": "lora_6_alpha", "dtype": "int64"}, {"name": "lora_7_name", "dtype": "string"}, {"name": "lora_7_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_7_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_7_rank", "dtype": "int64"}, {"name": "lora_7_alpha", "dtype": "int64"}, {"name": "lora_8_name", "dtype": "string"}, {"name": "lora_8_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_8_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_8_rank", "dtype": "int64"}, {"name": 
"lora_8_alpha", "dtype": "int64"}, {"name": "lora_9_name", "dtype": "string"}, {"name": "lora_9_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_9_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_9_rank", "dtype": "int64"}, {"name": "lora_9_alpha", "dtype": "int64"}, {"name": "lora_10_name", "dtype": "string"}, {"name": "lora_10_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_10_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_10_rank", "dtype": "int64"}, {"name": "lora_10_alpha", "dtype": "int64"}, {"name": "lora_11_name", "dtype": "string"}, {"name": "lora_11_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_11_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_11_rank", "dtype": "int64"}, {"name": "lora_11_alpha", "dtype": "int64"}, {"name": "lora_12_name", "dtype": "string"}, {"name": "lora_12_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_12_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_12_rank", "dtype": "int64"}, {"name": "lora_12_alpha", "dtype": "int64"}, {"name": "lora_13_name", "dtype": "string"}, {"name": "lora_13_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_13_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_13_rank", "dtype": "int64"}, {"name": "lora_13_alpha", "dtype": "int64"}, {"name": "lora_14_name", "dtype": "string"}, {"name": "lora_14_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_14_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_14_rank", "dtype": "int64"}, {"name": "lora_14_alpha", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 8661875544, "num_examples": 128}], "download_size": 5791365905, "dataset_size": 8661875544}, {"config_name": "stable-diffusion-1.5", "features": [{"name": "task_name", "dtype": "string"}, {"name": "layer_model", "dtype": "string"}, {"name": "layer_name", "dtype": "string"}, {"name": "pre_ft_name", "dtype": "string"}, {"name": "pre_ft_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_name", "dtype": "string"}, {"name": "lora_0_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_rank", "dtype": "int64"}, {"name": "lora_0_alpha", "dtype": "float64"}, {"name": "lora_1_name", "dtype": "string"}, {"name": "lora_1_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_1_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_1_rank", "dtype": "int64"}, {"name": "lora_1_alpha", "dtype": "float64"}, {"name": "lora_2_name", "dtype": "string"}, {"name": "lora_2_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_2_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_2_rank", "dtype": "int64"}, {"name": "lora_2_alpha", "dtype": "float64"}, {"name": "lora_3_name", "dtype": "string"}, {"name": "lora_3_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_3_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_3_rank", "dtype": "int64"}, {"name": "lora_3_alpha", "dtype": "float64"}, {"name": "lora_4_name", "dtype": "string"}, {"name": "lora_4_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_4_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_4_rank", "dtype": "int64"}, {"name": "lora_4_alpha", "dtype": "float64"}, {"name": "lora_5_name", "dtype": "string"}, {"name": "lora_5_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_5_B_weight", "sequence": {"sequence": 
"float32"}}, {"name": "lora_5_rank", "dtype": "int64"}, {"name": "lora_5_alpha", "dtype": "float64"}, {"name": "lora_6_name", "dtype": "string"}, {"name": "lora_6_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_6_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_6_rank", "dtype": "int64"}, {"name": "lora_6_alpha", "dtype": "float64"}, {"name": "lora_7_name", "dtype": "string"}, {"name": "lora_7_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_7_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_7_rank", "dtype": "int64"}, {"name": "lora_7_alpha", "dtype": "float64"}, {"name": "lora_8_name", "dtype": "string"}, {"name": "lora_8_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_8_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_8_rank", "dtype": "int64"}, {"name": "lora_8_alpha", "dtype": "float64"}, {"name": "lora_9_name", "dtype": "string"}, {"name": "lora_9_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_9_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_9_rank", "dtype": "int64"}, {"name": "lora_9_alpha", "dtype": "float64"}, {"name": "lora_10_name", "dtype": "string"}, {"name": "lora_10_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_10_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_10_rank", "dtype": "int64"}, {"name": "lora_10_alpha", "dtype": "float64"}, {"name": "lora_11_name", "dtype": "string"}, {"name": "lora_11_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_11_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_11_rank", "dtype": "int64"}, {"name": "lora_11_alpha", "dtype": "float64"}, {"name": "lora_12_name", "dtype": "string"}, {"name": "lora_12_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_12_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_12_rank", "dtype": "int64"}, {"name": "lora_12_alpha", "dtype": "float64"}, {"name": "lora_13_name", "dtype": "string"}, {"name": "lora_13_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_13_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_13_rank", "dtype": "int64"}, {"name": "lora_13_alpha", "dtype": "float64"}, {"name": "lora_14_name", "dtype": "string"}, {"name": "lora_14_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_14_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_14_rank", "dtype": "int64"}, {"name": "lora_14_alpha", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 2561357508, "num_examples": 264}], "download_size": 1724766354, "dataset_size": 2561357508}, {"config_name": "vit", "features": [{"name": "task_name", "dtype": "string"}, {"name": "layer_model", "dtype": "string"}, {"name": "layer_name", "dtype": "string"}, {"name": "pre_ft_name", "dtype": "string"}, {"name": "pre_ft_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_name", "dtype": "string"}, {"name": "lora_0_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_0_rank", "dtype": "int64"}, {"name": "lora_0_alpha", "dtype": "int64"}, {"name": "lora_1_name", "dtype": "string"}, {"name": "lora_1_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_1_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_1_rank", "dtype": "int64"}, {"name": "lora_1_alpha", "dtype": "int64"}, {"name": "lora_2_name", "dtype": "string"}, {"name": "lora_2_A_weight", "sequence": {"sequence": 
"float32"}}, {"name": "lora_2_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_2_rank", "dtype": "int64"}, {"name": "lora_2_alpha", "dtype": "int64"}, {"name": "lora_3_name", "dtype": "string"}, {"name": "lora_3_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_3_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_3_rank", "dtype": "int64"}, {"name": "lora_3_alpha", "dtype": "int64"}, {"name": "lora_4_name", "dtype": "string"}, {"name": "lora_4_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_4_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_4_rank", "dtype": "int64"}, {"name": "lora_4_alpha", "dtype": "int64"}, {"name": "lora_5_name", "dtype": "string"}, {"name": "lora_5_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_5_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_5_rank", "dtype": "int64"}, {"name": "lora_5_alpha", "dtype": "int64"}, {"name": "lora_6_name", "dtype": "string"}, {"name": "lora_6_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_6_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_6_rank", "dtype": "int64"}, {"name": "lora_6_alpha", "dtype": "int64"}, {"name": "lora_7_name", "dtype": "string"}, {"name": "lora_7_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_7_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_7_rank", "dtype": "int64"}, {"name": "lora_7_alpha", "dtype": "int64"}, {"name": "lora_8_name", "dtype": "string"}, {"name": "lora_8_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_8_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_8_rank", "dtype": "int64"}, {"name": "lora_8_alpha", "dtype": "int64"}, {"name": "lora_9_name", "dtype": "string"}, {"name": "lora_9_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_9_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_9_rank", "dtype": "int64"}, {"name": "lora_9_alpha", "dtype": "int64"}, {"name": "lora_10_name", "dtype": "string"}, {"name": "lora_10_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_10_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_10_rank", "dtype": "int64"}, {"name": "lora_10_alpha", "dtype": "int64"}, {"name": "lora_11_name", "dtype": "string"}, {"name": "lora_11_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_11_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_11_rank", "dtype": "int64"}, {"name": "lora_11_alpha", "dtype": "int64"}, {"name": "lora_12_name", "dtype": "string"}, {"name": "lora_12_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_12_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_12_rank", "dtype": "int64"}, {"name": "lora_12_alpha", "dtype": "int64"}, {"name": "lora_13_name", "dtype": "string"}, {"name": "lora_13_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_13_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_13_rank", "dtype": "int64"}, {"name": "lora_13_alpha", "dtype": "int64"}, {"name": "lora_14_name", "dtype": "string"}, {"name": "lora_14_A_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_14_B_weight", "sequence": {"sequence": "float32"}}, {"name": "lora_14_rank", "dtype": "int64"}, {"name": "lora_14_alpha", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 93231628, "num_examples": 24}], "download_size": 111481540, "dataset_size": 93231628}], "configs": [{"config_name": "mistral-7b-v0.1-dpo", "data_files": [{"split": 
"train", "path": "mistral-7b-v0.1-dpo/train-*"}]}, {"config_name": "mistral-7b-v0.1-sft", "data_files": [{"split": "train", "path": "mistral-7b-v0.1-sft/train-*"}]}, {"config_name": "stable-diffusion-1.5", "data_files": [{"split": "train", "path": "stable-diffusion-1.5/train-*"}]}, {"config_name": "vit", "data_files": [{"split": "train", "path": "vit/train-*"}]}]} | 2024-02-16T07:41:59+00:00 | [
"2402.10208"
] | [] | TAGS
#arxiv-2402.10208 #region-us
| Dataset Card for the LoWRA Bench Dataset
========================================
The *Lo*RA *W*eight *R*ecovery *A*ttack (LoWRA) Bench is a comprehensive
benchmark designed to evaluate Pre-Fine-Tuning (Pre-FT) weight recovery methods as presented
in the "Recovering the Pre-Fine-Tuning Weights of Generative Models" paper.
* Task Details
* Dataset Description
* Dataset Structure
+ Data Subsets
+ Data Fields
+ Layer Merging Example
* Dataset Creation
* Risks and Out-of-Scope Use
* Considerations for Using the Data
+ Licensing Information
+ Citation Information
* Homepage:
URL
* Repository:
URL
* Paper:
URL
* Point of Contact:
eliahu.horwitz@URL
Task Details
------------
Pre-Fine-Tuning Weight Recovery Attack Setting: We uncover a vulnerability in LoRA fine-tuned models wherein an attacker is
able to undo the fine-tuning process and recover the weights of the original pre-trained model.
The setting for the vulnerability is as follows:
(a) The attacker only has access to n different LoRA fine-tuned models.
(b) The attacker assumes that all n models originated from the same source model.
(c) Using only the n visible models, the attacker attempts to recover the original source model.
Note: The attacker has no access to the low-rank decomposition of the fine-tuned models.
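In standard LoRA notation (a restatement for clarity, not taken verbatim from the paper; the merging example elsewhere in this card uses the same scaling), each visible model's merged layer can be written as

$$W_i = W_{\text{pre-FT}} + \frac{\alpha_i}{r_i} B_i A_i, \qquad i = 1, \dots, n,$$

so Pre-FT weight recovery amounts to estimating $W_{\text{pre-FT}}$ from the merged $W_i$ alone, since the low-rank factors $A_i$ and $B_i$ are hidden from the attacker.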
Dataset Description
-------------------
The LoWRA Bench dataset is designed to evaluate the performance of Pre-FT weight recovery methods.
The dataset encompasses three pre-trained representative source models:
1. A Vision Transformer (ViT) pre-trained on ImageNet-1K.
2. Mistral-7B-v0.1.
3. Stable Diffusion 1.5.
These models collectively cover supervised and self-supervised objectives, spanning both vision and
natural language processing (NLP) domains, as well as generative and discriminative tasks.
Notably, these models are widely used and deployed in numerous production systems.
For each source model, we curate 15 LoRA models fine-tuned on diverse datasets, tasks, and objectives.
The dataset comprises a diverse array of layer types, including self-attention, cross-attention,
and MLPs. This diversity enables us to assess the generalization capabilities of Pre-FT methods.
The evaluation can be conducted on a per-model basis, per layer type, or per layer depth,
allowing for a comprehensive analysis of Pre-FT methods. Overall, our dataset includes 544 source
model layers. When taking into account the fine-tuned LoRA layers, the dataset includes over
8,000 layers.
Dataset Structure
-----------------
The dataset contains 4 subsets; for each subset, we curate 15 LoRA fine-tuned models.
Each row of the dataset represents a single layer that should be recovered and contains all the needed information for the recovery and numerical evaluation.
In particular, for each layer, the dataset includes the original Pre-FT weights and the *unmerged* fine-tuned LoRA weight matrices.
We decided to provide the unmerged weights instead of the merged ones for two reasons:
1. Providing the unmerged weights significantly reduces the storage size of the dataset (e.g., for a single Mistral subset this reduces the size from ~100GB to ~8GB).
2. Providing the unmerged weights allows the dataset user to study the properties of the fine-tuned LoRA layers and may help when developing new methods.
We leave the merging of the layers to the user; keep in mind that this should be done carefully and tested to ensure the original Pre-FT weights are not simply
provided to the method verbatim. See Layer Merging Example for an example taken from our GitHub repository.
### Data Subsets
The table below describes the dataset subsets in detail:
### Data Fields
As described above, each row of the dataset represents a single layer that should be recovered and contains the following fields:
```
task_name - The name of the task the model was fine-tuned on (subset).
layer_model - In some cases a Pre-FT model comprises more than one sub-model (e.g., Stable Diffusion fine-tuned both
             the UNet and the Text Encoder). This field specifies the model the layer belongs to.
layer_name - The name of the layer in the Pre-FT model as it appears in the model state_dict.
pre_ft_name - The name of the Pre-FT model (e.g., runwayml/stable-diffusion-v1-5).
pre_ft_weight - The weight matrix of the Pre-FT model's layer.
lora_{lora_idx}_name - The name of the LoRA fine-tuned model.
lora_{lora_idx}_A_weight - The LoRA A weight matrix of the LoRA fine-tuned model's layer.
lora_{lora_idx}_B_weight - The LoRA B weight matrix of the LoRA fine-tuned model's layer.
lora_{lora_idx}_rank - The LoRA rank of the LoRA fine-tuned model's layer.
lora_{lora_idx}_alpha - The LoRA alpha of the LoRA fine-tuned model's layer.
```
where '{lora_idx}' is the index of the LoRA fine-tuned model in the subset (there are 15 LoRA models per subset).
### Layer Merging Example
The following code snippet demonstrates merging the LoRA fine-tuned weights with the Pre-FT weights.
Dataset Creation
----------------
### Source Data
* The fine-tuning of the ViT models was performed using the PEFT library
on various datasets from the VTAB-1K benchmark.
* The fine-tuned LoRA models for Stable Diffusion are taken from civitai and were fine-tuned by RalFinger.
* The fine-tuning of Mistral was performed based on the Zephyr model as seen here.
For the full list of models and hyper-parameters see the appendix of the paper.
Risks and Out-of-Scope Use
--------------------------
Our work uncovers a significant vulnerability in fine-tuned models, allowing attackers to
access pre-fine-tuning weights. While this discovery reveals potential security risks,
our primary objective is to advance the field of Machine Learning and raise awareness within the
research community about the existing vulnerabilities in current models.
Instead of using the findings of this study to execute attacks, we advocate for their use by
model creators to enhance the safety and security of their models. By acknowledging and
addressing vulnerabilities, creators can proactively safeguard against potential threats.
Following established practices in the cyber-security community, we emphasize the importance of open
discussion and encourage the reporting of vulnerabilities. By fostering transparency and collaboration,
we can collectively create a safer environment for deploying machine learning models.
Considerations for Using the Data
---------------------------------
### Licensing Information
If you use this dataset in your work, please cite the following paper:
BibTeX:
| [
"### Data Subsets\n\n\nThe table below describes the dataset subsets in detail:",
"### Data Fields\n\n\nAs described above, each row of the dataset represents a single layer that should be recovered and contains the following fields:\n\n\n\n```\ntask_name - The name of the task the model was fine-tuned on (subset).\nlayer_model - In some cases a Pre-FT model has more than one model (e.g., Stable Diffusion fine-tuned both \n the UNet and the Text Encoder). This field specifies the model the layer belongs to.\nlayer_name - The name of the layer in the Pre-FT model as it appears in the model state_dict.\npre_ft_name - The name of the Pre-FT model (e.g., runwayml/stable-diffusion-v1-5).\npre_ft_weight - The weight matrix of the Pre-FT models layer. \nlora_{lora_idx}_name - The name of the LoRA fine-tuned model.\nlora_{lora_idx}_A_weight - The LoRA A weight matrix of the LoRA fine-tuned models layer.\nlora_{lora_idx}_B_weight - The LoRA B weight matrix of the LoRA fine-tuned models layer.\nlora_{lora_idx}_rank - The LoRA rank of the LoRA fine-tuned models layer.\nlora_{lora_idx}_alpha - The LoRA alpha of the LoRA fine-tuned models layer.\n\n```\n\nwhere '{lora\\_idx}' is the index of the LoRA fine-tuned model in the subset (there are 15 LoRA models per subset).",
"### Layer Merging Example\n\n\nThe following code snippet demonstrates merging the LoRA fine-tuned weights with the Pre-FT weights.\n\n\nDataset Creation\n----------------",
"### Source Data\n\n\n* The fine-tuning of the ViT models was performed using the PEFT library\non various datasets from the VTAB-1K benchmark.\n* The fine-tuned LoRA models for Stable Diffusion are taken from civitai and were fine-tuned by RalFinger.\n* The fine-tuning of Mistral was performed based on the Zephyr model as seen here.\n\n\nFor the full list of models and hyper-parameters see the appendix of the paper.\n\n\nRisks and Out-of-Scope Use\n--------------------------\n\n\nOur work uncovers a significant vulnerability in fine-tuned models, allowing attackers to\naccess pre-fine-tuning weights. While this discovery reveals potential security risks,\nour primary objective is to advance the field of Machine Learning and raise awareness within the\nresearch community about the existing vulnerabilities in current models.\n\n\nInstead of using the findings of this study to execute attacks, we advocate for their use by\nmodel creators to enhance the safety and security of their models. By acknowledging and\naddressing vulnerabilities, creators can proactively safeguard against potential threats.\n\n\nFollowing established practices in the cyber-security community, we emphasize the importance of open\ndiscussion and encourage the reporting of vulnerabilities. By fostering transparency and collaboration,\nwe can collectively create a safer environment for deploying machine learning models.\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Licensing Information\n\n\nIf you use this dataset in your work please cite the following paper:\n\n\nBibTeX:"
] | [
"TAGS\n#arxiv-2402.10208 #region-us \n",
"### Data Subsets\n\n\nThe table below describes the dataset subsets in detail:",
"### Data Fields\n\n\nAs described above, each row of the dataset represents a single layer that should be recovered and contains the following fields:\n\n\n\n```\ntask_name - The name of the task the model was fine-tuned on (subset).\nlayer_model - In some cases a Pre-FT model has more than one model (e.g., Stable Diffusion fine-tuned both \n the UNet and the Text Encoder). This field specifies the model the layer belongs to.\nlayer_name - The name of the layer in the Pre-FT model as it appears in the model state_dict.\npre_ft_name - The name of the Pre-FT model (e.g., runwayml/stable-diffusion-v1-5).\npre_ft_weight - The weight matrix of the Pre-FT models layer. \nlora_{lora_idx}_name - The name of the LoRA fine-tuned model.\nlora_{lora_idx}_A_weight - The LoRA A weight matrix of the LoRA fine-tuned models layer.\nlora_{lora_idx}_B_weight - The LoRA B weight matrix of the LoRA fine-tuned models layer.\nlora_{lora_idx}_rank - The LoRA rank of the LoRA fine-tuned models layer.\nlora_{lora_idx}_alpha - The LoRA alpha of the LoRA fine-tuned models layer.\n\n```\n\nwhere '{lora\\_idx}' is the index of the LoRA fine-tuned model in the subset (there are 15 LoRA models per subset).",
"### Layer Merging Example\n\n\nThe following code snippet demonstrates merging the LoRA fine-tuned weights with the Pre-FT weights.\n\n\nDataset Creation\n----------------",
"### Source Data\n\n\n* The fine-tuning of the ViT models was performed using the PEFT library\non various datasets from the VTAB-1K benchmark.\n* The fine-tuned LoRA models for Stable Diffusion are taken from civitai and were fine-tuned by RalFinger.\n* The fine-tuning of Mistral was performed based on the Zephyr model as seen here.\n\n\nFor the full list of models and hyper-parameters see the appendix of the paper.\n\n\nRisks and Out-of-Scope Use\n--------------------------\n\n\nOur work uncovers a significant vulnerability in fine-tuned models, allowing attackers to\naccess pre-fine-tuning weights. While this discovery reveals potential security risks,\nour primary objective is to advance the field of Machine Learning and raise awareness within the\nresearch community about the existing vulnerabilities in current models.\n\n\nInstead of using the findings of this study to execute attacks, we advocate for their use by\nmodel creators to enhance the safety and security of their models. By acknowledging and\naddressing vulnerabilities, creators can proactively safeguard against potential threats.\n\n\nFollowing established practices in the cyber-security community, we emphasize the importance of open\ndiscussion and encourage the reporting of vulnerabilities. By fostering transparency and collaboration,\nwe can collectively create a safer environment for deploying machine learning models.\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Licensing Information\n\n\nIf you use this dataset in your work please cite the following paper:\n\n\nBibTeX:"
] | [
14,
20,
381,
41,
321,
26
] | [
"passage: TAGS\n#arxiv-2402.10208 #region-us \n### Data Subsets\n\n\nThe table below describes the dataset subsets in detail:### Data Fields\n\n\nAs described above, each row of the dataset represents a single layer that should be recovered and contains the following fields:\n\n\n\n```\ntask_name - The name of the task the model was fine-tuned on (subset).\nlayer_model - In some cases a Pre-FT model has more than one model (e.g., Stable Diffusion fine-tuned both \n the UNet and the Text Encoder). This field specifies the model the layer belongs to.\nlayer_name - The name of the layer in the Pre-FT model as it appears in the model state_dict.\npre_ft_name - The name of the Pre-FT model (e.g., runwayml/stable-diffusion-v1-5).\npre_ft_weight - The weight matrix of the Pre-FT models layer. \nlora_{lora_idx}_name - The name of the LoRA fine-tuned model.\nlora_{lora_idx}_A_weight - The LoRA A weight matrix of the LoRA fine-tuned models layer.\nlora_{lora_idx}_B_weight - The LoRA B weight matrix of the LoRA fine-tuned models layer.\nlora_{lora_idx}_rank - The LoRA rank of the LoRA fine-tuned models layer.\nlora_{lora_idx}_alpha - The LoRA alpha of the LoRA fine-tuned models layer.\n\n```\n\nwhere '{lora\\_idx}' is the index of the LoRA fine-tuned model in the subset (there are 15 LoRA models per subset).### Layer Merging Example\n\n\nThe following code snippet demonstrates merging the LoRA fine-tuned weights with the Pre-FT weights.\n\n\nDataset Creation\n----------------"
] |
f78c9600697904af9f13daa23933f1047983a5d0 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
Tokenizer: mdeberta
Dataset: GZ-GOLD-NER-ALIGN_105
Unshuffled ratio: 1
Shuffled ratio: 0
Drop duplicates: False
Dataset path = /home/pgajo/working/food/data/GZ/GZ-GOLD/GZ-GOLD-NER-ALIGN_105.json
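For convenience, the dataset can be loaded directly from the Hub by the repository id shown at the end of this card (a minimal sketch):
```python
from datasets import load_dataset

ds = load_dataset("pgajo/mdeberta_GZ-GOLD-NER-ALIGN_105_U1_S0_DROP0_types")
print(ds)
```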
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | pgajo/mdeberta_GZ-GOLD-NER-ALIGN_105_U1_S0_DROP0_types | [
"region:us"
] | 2024-02-13T21:09:06+00:00 | {} | 2024-02-13T21:09:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
Tokenizer: mdeberta
Dataset: GZ-GOLD-NER-ALIGN_105
Unshuffled ratio: 1
Shuffled ratio: 0
Drop duplicates: False
Dataset path = /home/pgajo/working/food/data/GZ/GZ-GOLD/GZ-GOLD-NER-ALIGN_105.json
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: GZ-GOLD-NER-ALIGN_105\n\n Unshuffled ratio: 1\n\n Shuffled ratio: 0\n\n Drop duplicates: False\n\n Dataset path = /home/pgajo/working/food/data/GZ/GZ-GOLD/GZ-GOLD-NER-ALIGN_105.json",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: GZ-GOLD-NER-ALIGN_105\n\n Unshuffled ratio: 1\n\n Shuffled ratio: 0\n\n Drop duplicates: False\n\n Dataset path = /home/pgajo/working/food/data/GZ/GZ-GOLD/GZ-GOLD-NER-ALIGN_105.json",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
91,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: GZ-GOLD-NER-ALIGN_105\n\n Unshuffled ratio: 1\n\n Shuffled ratio: 0\n\n Drop duplicates: False\n\n Dataset path = /home/pgajo/working/food/data/GZ/GZ-GOLD/GZ-GOLD-NER-ALIGN_105.json## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
95230225b82c5581f4aed1c395aeea0d4fd3382f |
Synthetic dataset sampled from GPT-3.5 Turbo 0125:
```python
PROMPT = "Write a Socratic dialog, in which the User begins, between \"Assistant\" and his \"User\" about: "
def fetch_dialog_with_retry():
for attempt in range(RETRY_ATTEMPTS):
try:
response = client.chat.completions.create(
model="gpt-3.5-turbo-0125",
messages=[{"role": "user", "content": PROMPT + random.choice(extended_general_knowledge)}],
temperature=1.0,
max_tokens=2048,
timeout=API_TIMEOUT)
dialog = response.choices[0].message.content.strip()
# String replacements
dialog = dialog.replace("User: ", "[USER]: ")
dialog = dialog.replace("Assistant: ", "[ASSISTANT]: ")
dialog = dialog.replace("\n\n", "\n")
return dialog
except Exception as e:
print(f"Error fetching dialog (Attempt {attempt + 1}/{RETRY_ATTEMPTS}): {e}")
print(f"Cooling down for {attempt + 2} seconds before retrying...")
time.sleep(attempt + 1)
return None
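
# Example usage (sketch): sample a few dialogs and keep the non-empty ones.
if __name__ == "__main__":
    dialogs = [d for d in (fetch_dialog_with_retry() for _ in range(3)) if d]
    print(f"Fetched {len(dialogs)} dialogs")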
``` | markusheimerl/socratic_dialogs | [
"license:apache-2.0",
"region:us"
] | 2024-02-13T21:38:35+00:00 | {"license": "apache-2.0"} | 2024-02-14T12:48:57+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
Synthetic dataset sampled from GPT-3.5 Turbo 0125:
| [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] | [
14
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n"
] |
1eaeb60c75e6c8af06d2a3990ec0429a0e6da419 |
# Dataset Card for Evaluation run of mlabonne/Monarch-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/Monarch-7B](https://huggingface.co/mlabonne/Monarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__Monarch-7B",
"harness_winogrande_5",
split="train")
```
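The aggregated results mentioned above can be loaded the same way by pointing at the "results" configuration (a minimal sketch, assuming the configuration is named exactly as described):
```python
from datasets import load_dataset

# Aggregated metrics for the run live in the "results" configuration.
results = load_dataset("open-llm-leaderboard/details_mlabonne__Monarch-7B",
    "results",
    split="train")
print(results[0])
```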
## Latest results
These are the [latest results from run 2024-02-13T21:50:30.154125](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Monarch-7B/blob/main/results_2024-02-13T21-50-30.154125.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6503470541775671,
"acc_stderr": 0.03225506593829634,
"acc_norm": 0.6497458953786943,
"acc_norm_stderr": 0.0329312943358108,
"mc1": 0.6266829865361077,
"mc1_stderr": 0.016932370557570638,
"mc2": 0.7734559241701873,
"mc2_stderr": 0.013864158411608155
},
"harness|arc:challenge|25": {
"acc": 0.7056313993174061,
"acc_stderr": 0.013318528460539419,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869143
},
"harness|hellaswag|10": {
"acc": 0.7143995220075682,
"acc_stderr": 0.0045077680295900965,
"acc_norm": 0.8902609042023502,
"acc_norm_stderr": 0.003119254828848945
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799215,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799215
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038913,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038913
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039655,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039655
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6266829865361077,
"mc1_stderr": 0.016932370557570638,
"mc2": 0.7734559241701873,
"mc2_stderr": 0.013864158411608155
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750038
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.012731710925078146
}
}
```
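For readers who want to work with these aggregated numbers programmatically rather than copy them out of the JSON above, a minimal sketch is shown below. It assumes network access to the Hugging Face Hub and that the `results` configuration keeps its current layout; the exact column schema can vary between harness versions, so the sketch prints the features before relying on them.

```python
from datasets import load_dataset

# Aggregated metrics for every task live in the dedicated "results"
# configuration; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_mlabonne__Monarch-7B",
    "results",
    split="latest",
)

# Inspect the schema before assuming any particular column layout,
# since it can change between versions of the evaluation harness.
print(results)
print(results.features)
```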
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
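Pending a full structural description, one way to explore the layout directly is sketched below: it lists the available configurations (one per evaluated task, plus `results`) and loads a single task's details. The configuration name `harness_gsm8k_5` is taken from this run's metadata; treat this as an exploratory sketch rather than a stable API.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_mlabonne__Monarch-7B"

# Each evaluated task is exposed as its own configuration (63 for this
# run), alongside the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Splits are named after the run timestamp; "latest" always resolves
# to the most recent evaluation run.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```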
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
# Dataset Card for Evaluation run of RaduGabriel/MUZ
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaduGabriel/MUZ](https://huggingface.co/RaduGabriel/MUZ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__MUZ",
"harness_winogrande_5",
split="train")
```
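Each per-task configuration follows the same naming pattern as `harness_winogrande_5` above. As a minimal sketch (assuming only the standard `datasets` API; the expected count in the comment is taken from the description above), you can enumerate the available configurations and load the aggregated `results` configuration:

```python
from datasets import get_dataset_config_names, load_dataset

# List every configuration of this details dataset:
# 63 per-task configs plus the additional "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_RaduGabriel__MUZ")
print(len(configs), configs[:3])

# Load the aggregated results; the "latest" split points to the newest run.
results = load_dataset("open-llm-leaderboard/details_RaduGabriel__MUZ",
                       "results",
                       split="latest")
print(results)
```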
## Latest results
These are the [latest results from run 2024-02-13T22:13:46.441981](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__MUZ/blob/main/results_2024-02-13T22-13-46.441981.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6329065706688936,
"acc_stderr": 0.03256288715823963,
"acc_norm": 0.6348082926210658,
"acc_norm_stderr": 0.03322356042730485,
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961574,
"mc2": 0.6418307608805432,
"mc2_stderr": 0.014958982283642843
},
"harness|arc:challenge|25": {
"acc": 0.6109215017064846,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205765
},
"harness|hellaswag|10": {
"acc": 0.6695877315275841,
"acc_stderr": 0.004694002781939567,
"acc_norm": 0.8637721569408484,
"acc_norm_stderr": 0.0034232928816321524
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902796,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848026,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848026
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40670391061452515,
"acc_stderr": 0.016428811915898865,
"acc_norm": 0.40670391061452515,
"acc_norm_stderr": 0.016428811915898865
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053738,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053738
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.01272978538659856,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.01272978538659856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.02888819310398863,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.02888819310398863
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.03437519337338252,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.03437519337338252
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961574,
"mc2": 0.6418307608805432,
"mc2_stderr": 0.014958982283642843
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267198
},
"harness|gsm8k|5": {
"acc": 0.5830174374526156,
"acc_stderr": 0.013581320997216586
}
}
```
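If you prefer to re-derive the headline numbers from the raw file rather than copy them from the block above, the sketch below downloads the linked JSON and recomputes the per-benchmark scores. It is an assumption-laden sketch: the `"results"` top-level key and the per-benchmark metric choices (`acc_norm` for ARC and HellaSwag, `mc2` for TruthfulQA, `acc` elsewhere, with MMLU as the mean over the `hendrycksTest-*` subsets) follow the usual Open LLM Leaderboard convention rather than anything stated in this card.

```python
import json
import statistics

from huggingface_hub import hf_hub_download

# Download the raw results file linked in the "Latest results" section.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_RaduGabriel__MUZ",
    filename="results_2024-02-13T22-13-46.441981.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

results = data["results"]  # assumed top-level key of the harness output

# MMLU is conventionally reported as the mean accuracy over all
# hendrycksTest-* subsets (an assumption, not stated in the card).
mmlu = statistics.mean(
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
)
headline = {
    "ARC": results["harness|arc:challenge|25"]["acc_norm"],
    "HellaSwag": results["harness|hellaswag|10"]["acc_norm"],
    "MMLU": mmlu,
    "TruthfulQA": results["harness|truthfulqa:mc|0"]["mc2"],
    "Winogrande": results["harness|winogrande|5"]["acc"],
    "GSM8K": results["harness|gsm8k|5"]["acc"],
}
print({k: round(v, 4) for k, v in headline.items()})
print("average:", round(sum(headline.values()) / len(headline), 4))
```

Running this should reproduce the MMLU mean and an overall average consistent with the six-benchmark numbers reported above.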
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RaduGabriel__MUZ | [
"region:us"
] | 2024-02-13T22:16:04+00:00 | {"pretty_name": "Evaluation run of RaduGabriel/MUZ", "dataset_summary": "Dataset automatically created during the evaluation run of model [RaduGabriel/MUZ](https://huggingface.co/RaduGabriel/MUZ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaduGabriel__MUZ\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T22:13:46.441981](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__MUZ/blob/main/results_2024-02-13T22-13-46.441981.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6329065706688936,\n \"acc_stderr\": 0.03256288715823963,\n \"acc_norm\": 0.6348082926210658,\n \"acc_norm_stderr\": 0.03322356042730485,\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6418307608805432,\n \"mc2_stderr\": 0.014958982283642843\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205765\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6695877315275841,\n \"acc_stderr\": 0.004694002781939567,\n \"acc_norm\": 0.8637721569408484,\n \"acc_norm_stderr\": 0.0034232928816321524\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544074,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544074\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 
0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\": 0.667741935483871,\n \"acc_norm_stderr\": 0.0267955608481228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 
0.024203665177902796,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848026,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848026\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40670391061452515,\n \"acc_stderr\": 0.016428811915898865,\n \"acc_norm\": 0.40670391061452515,\n \"acc_norm_stderr\": 0.016428811915898865\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053738,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053738\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.01272978538659856,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.01272978538659856\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.02888819310398863,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.02888819310398863\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n \"acc_stderr\": 0.03437519337338252,\n \"acc_norm\": 0.6169154228855721,\n \"acc_norm_stderr\": 0.03437519337338252\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6418307608805432,\n \"mc2_stderr\": 0.014958982283642843\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267198\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5830174374526156,\n \"acc_stderr\": 0.013581320997216586\n }\n}\n```", "repo_url": "https://huggingface.co/RaduGabriel/MUZ", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|arc:challenge|25_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|gsm8k|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hellaswag|10_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T22-13-46.441981.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T22-13-46.441981.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T22-13-46.441981.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T22-13-46.441981.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T22-13-46.441981.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T22-13-46.441981.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["**/details_harness|winogrande|5_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T22-13-46.441981.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T22_13_46.441981", "path": ["results_2024-02-13T22-13-46.441981.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T22-13-46.441981.parquet"]}]}]} | 2024-02-13T22:16:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of RaduGabriel/MUZ
Dataset automatically created during the evaluation run of model RaduGabriel/MUZ on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
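A minimal sketch follows; the repository id is an assumption here (the processed card text above elides it), inferred from the leaderboard's standard `details_<org>__<model>` naming:

```python
from datasets import load_dataset

# Load one evaluation config from this run's details repository.
# NOTE: the repo id is assumed from the leaderboard's naming convention.
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__MUZ",
                    "harness_winogrande_5",
                    split="train")
```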
## Latest results
These are the latest results from run 2024-02-13T22:13:46.441981 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
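A short sketch for pulling the aggregated metrics directly (the "results" config is declared in this repository's metadata; the repo id is the same assumption as above):

```python
from datasets import load_dataset

# The aggregated metrics live in the dedicated "results" config;
# its "latest" split points at the run summarized above.
results = load_dataset("open-llm-leaderboard/details_RaduGabriel__MUZ",
                       "results",
                       split="latest")
```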
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of RaduGabriel/MUZ\n\n\n\nDataset automatically created during the evaluation run of model RaduGabriel/MUZ on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T22:13:46.441981(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of RaduGabriel/MUZ\n\n\n\nDataset automatically created during the evaluation run of model RaduGabriel/MUZ on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-13T22:13:46.441981(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of RaduGabriel/MUZ\n\n\n\nDataset automatically created during the evaluation run of model RaduGabriel/MUZ on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-13T22:13:46.441981(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
36f76c7040e9597a1710a0dc27951b4bbb98262b | # Dataset Card for "OpenHermes-AR-300K.csv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | medmac01/OpenHermes-AR-300K | [
"region:us"
] | 2024-02-13T22:52:35+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "model_name", "dtype": "string"}, {"name": "custom_instruction", "dtype": "bool"}, {"name": "idx", "dtype": "float64"}, {"name": "topic", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "conversations", "dtype": "string"}, {"name": "system_prompt", "dtype": "string"}, {"name": "avatarUrl", "dtype": "float64"}, {"name": "hash", "dtype": "float64"}, {"name": "category", "dtype": "float64"}, {"name": "id", "dtype": "string"}, {"name": "model", "dtype": "float64"}, {"name": "views", "dtype": "float64"}, {"name": "skip_prompt_formatting", "dtype": "float64"}, {"name": "title", "dtype": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 606005587, "num_examples": 300022}], "download_size": 249268422, "dataset_size": 606005587}} | 2024-02-13T23:01:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "URL"
More Information needed | [
"# Dataset Card for \"URL\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"URL\"\n\nMore Information needed"
] | [
6,
11
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"URL\"\n\nMore Information needed"
] |
f516575da44644ee6000e76ea5cd05fff3339ee8 |
# Dataset Card for Evaluation run of Xenon1/Zenith-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Zenith-7B](https://huggingface.co/Xenon1/Zenith-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B",
"harness_winogrande_5",
split="train")
```
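Each of the 63 configurations can be loaded the same way. For instance, a single MMLU subtask can be pinned to this specific run through its timestamped split instead of the moving "latest" split; the config and split names below are taken from this card's metadata:

```python
from datasets import load_dataset

# Pin one MMLU subtask to the 2024-02-13 run; split="latest" would
# instead follow whichever run was evaluated most recently.
algebra = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B",
                       "harness_hendrycksTest_abstract_algebra_5",
                       split="2024_02_13T23_51_23.458204")
```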
## Latest results
These are the [latest results from run 2024-02-13T23:51:23.458204](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Zenith-7B/blob/main/results_2024-02-13T23-51-23.458204.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6060569181007116,
"acc_stderr": 0.03295931159418492,
"acc_norm": 0.6155112782453724,
"acc_norm_stderr": 0.03370868110594653,
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.5575741417196965,
"mc2_stderr": 0.015266653264473424
},
"harness|arc:challenge|25": {
"acc": 0.515358361774744,
"acc_stderr": 0.014604496129394908,
"acc_norm": 0.5631399317406144,
"acc_norm_stderr": 0.014494421584256532
},
"harness|hellaswag|10": {
"acc": 0.6144194383588927,
"acc_stderr": 0.004857374133246894,
"acc_norm": 0.8110934076877117,
"acc_norm_stderr": 0.003906344213756631
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797612,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797612
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.025822106119415895,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.025822106119415895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399303,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835795,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835795
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092365,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0251901813276084,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0251901813276084
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32737430167597764,
"acc_stderr": 0.015694238967737386,
"acc_norm": 0.32737430167597764,
"acc_norm_stderr": 0.015694238967737386
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632938,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632938
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41199478487614083,
"acc_stderr": 0.012570871032146077,
"acc_norm": 0.41199478487614083,
"acc_norm_stderr": 0.012570871032146077
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281508,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.5575741417196965,
"mc2_stderr": 0.015266653264473424
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497811
},
"harness|gsm8k|5": {
"acc": 0.12054586808188021,
"acc_stderr": 0.008968608285309088
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Xenon1__Zenith-7B | [
"region:us"
] | 2024-02-13T23:53:39+00:00 | {"pretty_name": "Evaluation run of Xenon1/Zenith-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xenon1/Zenith-7B](https://huggingface.co/Xenon1/Zenith-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Zenith-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T23:51:23.458204](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Zenith-7B/blob/main/results_2024-02-13T23-51-23.458204.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6060569181007116,\n \"acc_stderr\": 0.03295931159418492,\n \"acc_norm\": 0.6155112782453724,\n \"acc_norm_stderr\": 0.03370868110594653,\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5575741417196965,\n \"mc2_stderr\": 0.015266653264473424\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.515358361774744,\n \"acc_stderr\": 0.014604496129394908,\n \"acc_norm\": 0.5631399317406144,\n \"acc_norm_stderr\": 0.014494421584256532\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6144194383588927,\n \"acc_stderr\": 0.004857374133246894,\n \"acc_norm\": 0.8110934076877117,\n \"acc_norm_stderr\": 0.003906344213756631\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797612,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797612\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266237,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266237\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n \"acc_stderr\": 0.025822106119415895,\n \"acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.025822106119415895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \"acc_norm\": 
0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399303,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835795,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835795\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092365,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 0.01489723522945071\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0251901813276084,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0251901813276084\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32737430167597764,\n \"acc_stderr\": 0.015694238967737386,\n \"acc_norm\": 0.32737430167597764,\n \"acc_norm_stderr\": 0.015694238967737386\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632938,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632938\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41199478487614083,\n \"acc_stderr\": 0.012570871032146077,\n \"acc_norm\": 0.41199478487614083,\n \"acc_norm_stderr\": 0.012570871032146077\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281508,\n \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281508\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.7661691542288557,\n \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5575741417196965,\n \"mc2_stderr\": 0.015266653264473424\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497811\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12054586808188021,\n \"acc_stderr\": 0.008968608285309088\n }\n}\n```", "repo_url": "https://huggingface.co/Xenon1/Zenith-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|arc:challenge|25_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|gsm8k|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hellaswag|10_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T23-51-23.458204.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T23-51-23.458204.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T23-51-23.458204.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T23-51-23.458204.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T23-51-23.458204.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T23-51-23.458204.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["**/details_harness|winogrande|5_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T23-51-23.458204.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T23_51_23.458204", "path": ["results_2024-02-13T23-51-23.458204.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T23-51-23.458204.parquet"]}]}]} | 2024-02-13T23:54:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Xenon1/Zenith-7B
Dataset automatically created during the evaluation run of model Xenon1/Zenith-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
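A minimal sketch, assuming the repository id `open-llm-leaderboard/details_Xenon1__Zenith-7B` (inferred from the naming convention the neighboring cards use; the `harness_winogrande_5` configuration is declared in this card's metadata):

```python
from datasets import load_dataset

# Load the per-sample details of one task; the configuration name follows
# the "harness_<task>_<num_fewshot>" pattern declared in the card metadata.
data = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B",
	"harness_winogrande_5",
	split="train")
```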
## Latest results
These are the latest results from run 2024-02-13T23:51:23.458204 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval).
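The aggregated scores of the run live in the "results" configuration declared in the card metadata; a minimal sketch for pulling them, again assuming the inferred repository id:

```python
from datasets import load_dataset

# "results" stores the aggregated scores; the "latest" split is declared
# in the card metadata and tracks the most recent run.
results = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B",
	"results",
	split="latest")
print(results[0])
```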
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
bbd3d06e9b309e35dfb4bad7cba7edaec96662db |
# Dataset Card for Evaluation run of Mihaiii/Bucharest-0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Bucharest-0.1](https://huggingface.co/Mihaiii/Bucharest-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Bucharest-0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T00:16:59.594031](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Bucharest-0.1/blob/main/results_2024-02-14T00-16-59.594031.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.661247276384782,
"acc_stderr": 0.03141201491493503,
"acc_norm": 0.6641358272243135,
"acc_norm_stderr": 0.03203652707247171,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.4793790433538082,
"mc2_stderr": 0.014619267505513112
},
"harness|arc:challenge|25": {
"acc": 0.6032423208191127,
"acc_stderr": 0.014296513020180642,
"acc_norm": 0.6535836177474402,
"acc_norm_stderr": 0.013905011180063232
},
"harness|hellaswag|10": {
"acc": 0.6644094801832304,
"acc_stderr": 0.00471231451195098,
"acc_norm": 0.854511053574985,
"acc_norm_stderr": 0.003518725257365604
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810535,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810535
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.02554284681740049,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.02554284681740049
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.02203721734026784,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.02203721734026784
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678185,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678185
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857406,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8623853211009175,
"acc_stderr": 0.014770105878649395,
"acc_norm": 0.8623853211009175,
"acc_norm_stderr": 0.014770105878649395
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3888268156424581,
"acc_stderr": 0.016303899530796123,
"acc_norm": 0.3888268156424581,
"acc_norm_stderr": 0.016303899530796123
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4941329856584094,
"acc_stderr": 0.012769356925216526,
"acc_norm": 0.4941329856584094,
"acc_norm_stderr": 0.012769356925216526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.018635594034423976,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.018635594034423976
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.03015113445777634,
"acc_norm": 0.9,
"acc_norm_stderr": 0.03015113445777634
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.4793790433538082,
"mc2_stderr": 0.014619267505513112
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.010759352014855922
},
"harness|gsm8k|5": {
"acc": 0.5708870356330553,
"acc_stderr": 0.013633369425647232
}
}
```
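The block above is excerpted from the results file linked at the top of this section. A minimal sketch for fetching that file directly, assuming only the filename shown in the link (the full file may contain more than the excerpt, so its exact top-level layout is not asserted here):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the "Latest results" link.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Mihaiii__Bucharest-0.1",
    filename="results_2024-02-14T00-16-59.594031.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

print(sorted(raw.keys()))
```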
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Mihaiii__Bucharest-0.1 | [
"region:us"
] | 2024-02-14T00:19:14+00:00 | {"pretty_name": "Evaluation run of Mihaiii/Bucharest-0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mihaiii/Bucharest-0.1](https://huggingface.co/Mihaiii/Bucharest-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Bucharest-0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T00:16:59.594031](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Bucharest-0.1/blob/main/results_2024-02-14T00-16-59.594031.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.661247276384782,\n \"acc_stderr\": 0.03141201491493503,\n \"acc_norm\": 0.6641358272243135,\n \"acc_norm_stderr\": 0.03203652707247171,\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4793790433538082,\n \"mc2_stderr\": 0.014619267505513112\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6032423208191127,\n \"acc_stderr\": 0.014296513020180642,\n \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063232\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6644094801832304,\n \"acc_stderr\": 0.00471231451195098,\n \"acc_norm\": 0.854511053574985,\n \"acc_norm_stderr\": 0.003518725257365604\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810535,\n \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810535\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.02554284681740049,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.02554284681740049\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n \"acc_stderr\": 0.02203721734026784,\n \"acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.02203721734026784\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678185,\n \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678185\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 
0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8623853211009175,\n \"acc_stderr\": 0.014770105878649395,\n \"acc_norm\": 0.8623853211009175,\n \"acc_norm_stderr\": 0.014770105878649395\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993466\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3888268156424581,\n \"acc_stderr\": 0.016303899530796123,\n \"acc_norm\": 0.3888268156424581,\n \"acc_norm_stderr\": 0.016303899530796123\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4941329856584094,\n \"acc_stderr\": 0.012769356925216526,\n \"acc_norm\": 0.4941329856584094,\n \"acc_norm_stderr\": 0.012769356925216526\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.018635594034423976,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.018635594034423976\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4793790433538082,\n \"mc2_stderr\": 0.014619267505513112\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.010759352014855922\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5708870356330553,\n \"acc_stderr\": 0.013633369425647232\n }\n}\n```", "repo_url": "https://huggingface.co/Mihaiii/Bucharest-0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", 
"configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|arc:challenge|25_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|gsm8k|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hellaswag|10_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T00-16-59.594031.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T00-16-59.594031.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T00-16-59.594031.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T00-16-59.594031.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T00-16-59.594031.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["**/details_harness|winogrande|5_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T00-16-59.594031.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T00_16_59.594031", "path": ["results_2024-02-14T00-16-59.594031.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T00-16-59.594031.parquet"]}]}]} | 2024-02-14T00:19:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Mihaiii/Bucharest-0.1
Dataset automatically created during the evaluation run of model Mihaiii/Bucharest-0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-14T00:16:59.594031 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and under the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Mihaiii/Bucharest-0.1\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Bucharest-0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T00:16:59.594031(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Mihaiii/Bucharest-0.1\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Bucharest-0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T00:16:59.594031(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mihaiii/Bucharest-0.1\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Bucharest-0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T00:16:59.594031(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
3e326e9fe5a1ab071f431a4a5df3cc2c3f022ce6 |
# Dataset Card for Evaluation run of Test157t/Echidna-7b-128k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Echidna-7b-128k](https://huggingface.co/Test157t/Echidna-7b-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
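# Per-sample details for the Winogrande task; the "train" split always tracks the latest run.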
data = load_dataset("open-llm-leaderboard/details_Test157t__Echidna-7b-128k",
"harness_winogrande_5",
split="train")
```
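The snippet above returns the per-sample details of a single task. As a minimal sketch building on the same API, you can also load the aggregated metrics via the "results" configuration, or pin a specific run by its timestamped split; the split name below is an assumption derived from the `YYYY_MM_DDTHH_MM_SS` naming pattern these datasets use, applied to the run shown in this card:

```python
from datasets import load_dataset

# Aggregated metrics for all tasks; "latest" resolves to the most recent run.
results = load_dataset("open-llm-leaderboard/details_Test157t__Echidna-7b-128k",
	"results",
	split="latest")

# Pin a specific evaluation run by its timestamped split instead of "latest"
# (split name assumed from the run timestamp 2024-02-14T02:20:02.752572).
run_details = load_dataset("open-llm-leaderboard/details_Test157t__Echidna-7b-128k",
	"harness_winogrande_5",
	split="2024_02_14T02_20_02.752572")
```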
## Latest results
These are the [latest results from run 2024-02-14T02:20:02.752572](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Echidna-7b-128k/blob/main/results_2024-02-14T02-20-02.752572.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and under the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6328967271943693,
"acc_stderr": 0.032637598920195965,
"acc_norm": 0.6346290203109765,
"acc_norm_stderr": 0.033298611822196275,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.5607222202477442,
"mc2_stderr": 0.015546296015638722
},
"harness|arc:challenge|25": {
"acc": 0.6296928327645052,
"acc_stderr": 0.01411129875167495,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.67805218084047,
"acc_stderr": 0.004662682233093781,
"acc_norm": 0.851822346146186,
"acc_norm_stderr": 0.0035454991695580518
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113729,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113729
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895514,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659356,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659356
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02959732973097808,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02959732973097808
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739153,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739153
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794086,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794086
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.01389086216287617,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.01389086216287617
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4100558659217877,
"acc_stderr": 0.01644970820902608,
"acc_norm": 0.4100558659217877,
"acc_norm_stderr": 0.01644970820902608
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603742,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603742
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.012700582404768226,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.012700582404768226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573695,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573695
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.5607222202477442,
"mc2_stderr": 0.015546296015638722
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625868
},
"harness|gsm8k|5": {
"acc": 0.5686125852918877,
"acc_stderr": 0.013642195352511561
}
}
```
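As a small post-processing sketch (not part of the generated card), the JSON block above can be flattened to rank the MMLU (`hendrycksTest`) subtasks by normalized accuracy; the `results.json` filename is only an assumption for where the block was saved:

```python
import json

# Assumes the JSON block above was saved to results.json (hypothetical filename).
with open("results.json") as f:
    results = json.load(f)

PREFIX = "harness|hendrycksTest-"
# Map each MMLU subtask name (e.g. "abstract_algebra") to its normalized accuracy.
mmlu = {
    task[len(PREFIX):].split("|")[0]: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith(PREFIX)
}

# Show the five strongest and five weakest subtasks.
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
for name, acc_norm in ranked[:5] + ranked[-5:]:
    print(f"{name}: {acc_norm:.3f}")
```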
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Test157t__Echidna-7b-128k | [
"region:us"
] | 2024-02-14T00:45:19+00:00 | {"pretty_name": "Evaluation run of Test157t/Echidna-7b-128k", "dataset_summary": "Dataset automatically created during the evaluation run of model [Test157t/Echidna-7b-128k](https://huggingface.co/Test157t/Echidna-7b-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Echidna-7b-128k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T02:20:02.752572](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Echidna-7b-128k/blob/main/results_2024-02-14T02-20-02.752572.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6328967271943693,\n \"acc_stderr\": 0.032637598920195965,\n \"acc_norm\": 0.6346290203109765,\n \"acc_norm_stderr\": 0.033298611822196275,\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.5607222202477442,\n \"mc2_stderr\": 0.015546296015638722\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6296928327645052,\n \"acc_stderr\": 0.01411129875167495,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.01383056892797433\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.67805218084047,\n \"acc_stderr\": 0.004662682233093781,\n \"acc_norm\": 0.851822346146186,\n \"acc_norm_stderr\": 0.0035454991695580518\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113729,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113729\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895514,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659356,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659356\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 
0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02959732973097808,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02959732973097808\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739153,\n \"acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739153\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794086,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794086\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.01389086216287617,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 
0.01389086216287617\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n \"acc_stderr\": 0.01644970820902608,\n \"acc_norm\": 0.4100558659217877,\n \"acc_norm_stderr\": 0.01644970820902608\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603742,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603742\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n \"acc_stderr\": 0.012700582404768226,\n \"acc_norm\": 0.44784876140808344,\n \"acc_norm_stderr\": 0.012700582404768226\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573695,\n \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573695\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.5607222202477442,\n \"mc2_stderr\": 0.015546296015638722\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625868\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5686125852918877,\n \"acc_stderr\": 0.013642195352511561\n }\n}\n```", "repo_url": "https://huggingface.co/Test157t/Echidna-7b-128k", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|arc:challenge|25_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|arc:challenge|25_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|gsm8k|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|gsm8k|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hellaswag|10_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hellaswag|10_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T00-43-01.888178.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T00-43-01.888178.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T02-20-02.752572.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T02-20-02.752572.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T02-20-02.752572.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T02-20-02.752572.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T00-43-01.888178.parquet"]}, 
{"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["**/details_harness|winogrande|5_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": ["**/details_harness|winogrande|5_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T02-20-02.752572.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T00_43_01.888178", "path": ["results_2024-02-14T00-43-01.888178.parquet"]}, {"split": "2024_02_14T02_20_02.752572", "path": 
["results_2024-02-14T02-20-02.752572.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T02-20-02.752572.parquet"]}]}]} | 2024-02-14T02:22:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Test157t/Echidna-7b-128k
Dataset automatically created during the evaluation run of model Test157t/Echidna-7b-128k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
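A minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repository naming (the exact repo id is not stated in this section):

```python
from datasets import load_dataset

# Repo id inferred from the model name (an assumption); adjust if the actual repository differs.
data = load_dataset("open-llm-leaderboard/details_Test157t__Echidna-7b-128k",
    "harness_winogrande_5",
    split="train")
```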
## Latest results
These are the latest results from run 2024-02-14T02:20:02.752572 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Test157t/Echidna-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Echidna-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T02:20:02.752572 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Test157t/Echidna-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Echidna-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T02:20:02.752572 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Test157t/Echidna-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Echidna-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T02:20:02.752572 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
588dcf45ae9fac3c24ff8f8d7a74cfadb536b33e | # Dataset Card for "race_all_fr"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Original Dataset Homepage](http://www.cs.cmu.edu/~glai1/data/race/)
- **Repository:** [Translated Dataset on Hugging Face](https://huggingface.co/datasets/race_all_fr)
- **Paper:** [Original Dataset Paper](https://arxiv.org/abs/1704.04683)
- **Leaderboard:** [Original Dataset Leaderboard](https://paperswithcode.com/dataset/race)
### Dataset Summary
`race_all_fr` is the French version of the [RACE](https://huggingface.co/datasets/race) dataset, a large reading-comprehension dataset comprising more than 28,000 passages and nearly 100,000 questions. The original dataset, designed for middle- and high-school students in China, was translated into French to broaden its accessibility and enable reading-comprehension research in other languages.
### Supported Tasks and Leaderboards
The supported tasks and leaderboards are identical to those of the original RACE dataset, adapted for the French language.
### Languages
The dataset is entirely in French.
## Dataset Structure
### Data Instances
Data instances are structured identically to those of the original RACE dataset, but translated into French.
### Data Fields
- `example_id`: a unique identifier for each example.
- `article`: the text of the article on which the questions are based.
- `question`: the question asked.
- `options`: the four answer options provided, of which only one is correct.
- `answer`: the letter corresponding to the correct answer among the options.
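A minimal loading sketch for these fields (the repository id is taken from this card; the split name and the letter-to-index mapping for `answer` are assumptions):

```python
from datasets import load_dataset

# Repository id from this card; split layout assumed to mirror the original RACE dataset.
ds = load_dataset("MangoHiller/race_all_fr", split="train")

sample = ds[0]
print(sample["question"])  # the question asked
print(sample["options"])   # four options, exactly one of which is correct
# "answer" is a letter; map it to the index of the correct option (assuming "A".."D").
print(sample["options"][ord(sample["answer"]) - ord("A")])
```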
### Data Splits
The data splits (train/validation/test) are the same as those of the original RACE dataset.
## Dataset Creation
### Curation Rationale
This dataset was created to expand the resources available for natural language processing (NLP) research in French, specifically for reading comprehension.
### Source Data
#### Initial Data Collection and Normalization
The source data are identical to those of the RACE dataset, but have been translated into French.
### Annotations
The annotations remain unchanged from the original, except for the language.
### Personal and Sensitive Information
The considerations are the same as for the original RACE dataset.
## Considerations for Using the Data
### Social Impact of Dataset
Translating datasets into different languages is crucial for making NLP research more widely accessible and for enabling the training of multilingual models.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
Dataset translated using [Large_dataset_translator](https://github.com/vTuanpham/Large_dataset_translator) and the Google Translate API.
### Licensing Information
The same licenses as the original RACE dataset apply. Please see the [following link](http://www.cs.cmu.edu/~glai1/data/race/) for more information.
### Citation Information
Please cite the original RACE dataset paper when using `race_all_fr`:
```bibtex
@inproceedings{lai-etal-2017-race,
title = "{RACE}: Large-scale {R}e{A}ding Comprehension Dataset From Examinations",
author = "Lai, Guokun and
Xie, Qizhe and
Liu, Hanxiao and
Yang, Yiming and
Hovy, Eduard",
booktitle = "Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing",
month = sep,
year = "2017",
address = "Copenhagen, Denmark",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/D17-1082",
doi = "10.18653/v1/D17-1082",
pages = "785--794",
}
```
### Contributions
The translation of this dataset was carried out by [@MangoHiller](https://huggingface.co/MangoHiller). For the original contributions, please refer to the RACE dataset's GitHub repository: [https://github.com/qizhex/RACE_AR_baselines](https://github.com/qizhex/RACE_AR_baselines). | MangoHiller/race_all_fr | [
"task_categories:multiple-choice",
"task_ids:multiple-choice-qa",
"multilinguality:monolingual",
"size_categories:10K<n<100K",
"source_datasets:https://huggingface.co/datasets/race",
"language:fr",
"license:other",
"arxiv:1704.04683",
"region:us"
] | 2024-02-14T00:53:12+00:00 | {"language": ["fr"], "license": "other", "multilinguality": ["monolingual"], "size_categories": ["10K<n<100K"], "source_datasets": ["https://huggingface.co/datasets/race"], "task_categories": ["multiple-choice"], "task_ids": ["multiple-choice-qa"], "pretty_name": "RACE_fr", "license_name": "other", "license_link": "LICENSE"} | 2024-02-15T23:31:40+00:00 | [
"1704.04683"
] | [
"fr"
] | TAGS
#task_categories-multiple-choice #task_ids-multiple-choice-qa #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-https-//huggingface.co/datasets/race #language-French #license-other #arxiv-1704.04683 #region-us
| # Dataset Card for "race_all_fr"
## Table of Contents
- Dataset Description
- Dataset Summary
- Supported Tasks and Leaderboards
- Languages
- Dataset Structure
- Data Instances
- Data Fields
- Data Splits
- Dataset Creation
- Curation Rationale
- Source Data
- Annotations
- Personal and Sensitive Information
- Considerations for Using the Data
- Social Impact of Dataset
- Discussion of Biases
- Other Known Limitations
- Additional Information
- Dataset Curators
- Licensing Information
- Citation Information
- Contributions
## Dataset Description
- Homepage: Original Dataset Homepage
- Repository: Translated Dataset on Hugging Face
- Paper: Original Dataset Paper
- Leaderboard: Original Dataset Leaderboard
### Dataset Summary
'race_all_fr' is the French version of the RACE dataset, a large reading-comprehension dataset comprising more than 28,000 passages and nearly 100,000 questions. The original dataset, designed for middle- and high-school students in China, was translated into French to broaden its accessibility and enable reading-comprehension research in other languages.
### Supported Tasks and Leaderboards
The supported tasks and leaderboards are identical to those of the original RACE dataset, adapted for the French language.
### Languages
The dataset is entirely in French.
## Dataset Structure
### Data Instances
Data instances are structured identically to those of the original RACE dataset, but translated into French.
### Data Fields
- 'example_id': a unique identifier for each example.
- 'article': the text of the article on which the questions are based.
- 'question': the question asked.
- 'options': the four answer options provided, of which only one is correct.
- 'answer': the letter corresponding to the correct answer among the options.
### Data Splits
The data splits (train/validation/test) are the same as those of the original RACE dataset.
## Dataset Creation
### Curation Rationale
This dataset was created to expand the resources available for natural language processing (NLP) research in French, specifically for reading comprehension.
### Source Data
#### Initial Data Collection and Normalization
The source data are identical to those of the RACE dataset, but have been translated into French.
### Annotations
The annotations remain unchanged from the original, except for the language.
### Personal and Sensitive Information
The considerations are the same as for the original RACE dataset.
## Considerations for Using the Data
### Social Impact of Dataset
Translating datasets into different languages is crucial for making NLP research more widely accessible and for enabling the training of multilingual models.
### Discussion of Biases
### Other Known Limitations
## Additional Information
Dataset translated using Large_dataset_translator and the Google Translate API.
### Licensing Information
The same licenses as the original RACE dataset apply. Please see the following link for more information.
Please cite the original RACE dataset paper when using 'race_all_fr':
### Contributions
The translation of this dataset was carried out by @MangoHiller. For the original contributions, please refer to the RACE dataset's GitHub repository: URL | [
"# Dataset Card for \"race_all_fr\"",
"## Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n\n- Homepage: Original Dataset Homepage\n- Repository: Translated Dataset on Hugging Face\n- Paper: Original Dataset Paper\n- Leaderboard: Original Dataset Leaderboard",
"### Dataset Summary\n\n'race_all_fr' is the French version of the RACE dataset, a large reading-comprehension dataset comprising more than 28,000 passages and nearly 100,000 questions. The original dataset, designed for middle- and high-school students in China, was translated into French to broaden its accessibility and enable reading-comprehension research in other languages.",
"### Supported Tasks and Leaderboards\n\nThe supported tasks and leaderboards are identical to those of the original RACE dataset, adapted for the French language.",
"### Languages\n\nThe dataset is entirely in French.",
"## Dataset Structure",
"### Data Instances\n\nData instances are structured identically to those of the original RACE dataset, but translated into French.",
"### Data Fields\n\n- 'example_id': a unique identifier for each example.\n- 'article': the text of the article on which the questions are based.\n- 'question': the question asked.\n- 'options': the four answer options provided, of which only one is correct.\n- 'answer': the letter corresponding to the correct answer among the options.",
"### Data Splits\n\nThe data splits (train/validation/test) are the same as those of the original RACE dataset.",
"## Dataset Creation",
"### Curation Rationale\n\nThis dataset was created to expand the resources available for natural language processing (NLP) research in French, specifically for reading comprehension.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nThe source data are identical to those of the RACE dataset, but have been translated into French.",
"### Annotations\n\nThe annotations remain unchanged from the original, except for the language.",
"#### Personal and Sensitive Information\n\nThe considerations are the same as for the original RACE dataset.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nTranslating datasets into different languages is crucial for making NLP research more widely accessible and for enabling the training of multilingual models.",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information\n\nDataset translated using Large_dataset_translator and the Google Translate API.",
"### Licensing Information\n\nThe same licenses as the original RACE dataset apply. Please see the following link for more information.\n\n\n\nPlease cite the original RACE dataset paper when using 'race_all_fr':",
"### Contributions\n\nThe translation of this dataset was carried out by @MangoHiller. For the original contributions, please refer to the RACE dataset's GitHub repository: URL"
] | [
"TAGS\n#task_categories-multiple-choice #task_ids-multiple-choice-qa #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-https-//huggingface.co/datasets/race #language-French #license-other #arxiv-1704.04683 #region-us \n",
"# Dataset Card for \"race_all_fr\"",
"## Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n\n- Homepage: Original Dataset Homepage\n- Repository: Translated Dataset on Hugging Face\n- Paper: Original Dataset Paper\n- Leaderboard: Original Dataset Leaderboard",
"### Dataset Summary\n\n'race_all_fr' is the French version of the RACE dataset, a large reading-comprehension dataset comprising more than 28,000 passages and nearly 100,000 questions. The original dataset, designed for middle- and high-school students in China, was translated into French to broaden its accessibility and enable reading-comprehension research in other languages.",
"### Supported Tasks and Leaderboards\n\nThe supported tasks and leaderboards are identical to those of the original RACE dataset, adapted for the French language.",
"### Languages\n\nThe dataset is entirely in French.",
"## Dataset Structure",
"### Data Instances\n\nData instances are structured identically to those of the original RACE dataset, but translated into French.",
"### Data Fields\n\n- 'example_id': a unique identifier for each example.\n- 'article': the text of the article on which the questions are based.\n- 'question': the question asked.\n- 'options': the four answer options provided, of which only one is correct.\n- 'answer': the letter corresponding to the correct answer among the options.",
"### Data Splits\n\nThe data splits (train/validation/test) are the same as those of the original RACE dataset.",
"## Dataset Creation",
"### Curation Rationale\n\nThis dataset was created to expand the resources available for natural language processing (NLP) research in French, specifically for reading comprehension.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nThe source data are identical to those of the RACE dataset, but have been translated into French.",
"### Annotations\n\nThe annotations remain unchanged from the original, except for the language.",
"#### Personal and Sensitive Information\n\nThe considerations are the same as for the original RACE dataset.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nTranslating datasets into different languages is crucial for making NLP research more widely accessible and for enabling the training of multilingual models.",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information\n\nDataset translated using Large_dataset_translator and the Google Translate API.",
"### Licensing Information\n\nThe same licenses as the original RACE dataset apply. Please see the following link for more information.\n\n\n\nPlease cite the original RACE dataset paper when using 'race_all_fr':",
"### Contributions\n\nThe translation of this dataset was carried out by @MangoHiller. For the original contributions, please refer to the RACE dataset's GitHub repository: URL"
] | [
91,
12,
120,
41,
96,
38,
12,
6,
36,
91,
33,
5,
44,
4,
33,
30,
26,
8,
42,
8,
7,
30,
63,
47
] | [
"passage: TAGS\n#task_categories-multiple-choice #task_ids-multiple-choice-qa #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-https-//huggingface.co/datasets/race #language-French #license-other #arxiv-1704.04683 #region-us \n# Dataset Card for \"race_all_fr\"## Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions## Dataset Description\n\n- Homepage: Original Dataset Homepage\n- Repository: Translated Dataset on Hugging Face\n- Paper: Original Dataset Paper\n- Leaderboard: Original Dataset Leaderboard### Dataset Summary\n\n'race_all_fr' is the French version of the RACE dataset, a large reading-comprehension dataset comprising more than 28,000 passages and nearly 100,000 questions. The original dataset, designed for middle- and high-school students in China, was translated into French to broaden its accessibility and enable reading-comprehension research in other languages.### Supported Tasks and Leaderboards\n\nThe supported tasks and leaderboards are identical to those of the original RACE dataset, adapted for the French language.### Languages\n\nThe dataset is entirely in French.## Dataset Structure### Data Instances\n\nData instances are structured identically to those of the original RACE dataset, but translated into French."
] |
5e253679be948dcd21329c15f3a8a9d4526da5ce |
Based on the ukr_pravda dataset: https://huggingface.co/datasets/shamotskyi/ukr_pravda_2y. Licensed as CC-BY-NC 4.0.
For each article, its text and titles are given, as well as _masked_ text and title (with all digits replaced with "X").
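The masking is easy to reproduce; below is a sketch of the digit replacement (my reconstruction, not the dataset's actual preprocessing code):

```python
import re

def mask_digits(text: str) -> str:
    # Every digit becomes "X", as in the _masked_ text and title fields.
    return re.sub(r"\d", "X", text)

print(mask_digits("Enemy losses: 1234 tanks on 24.02"))  # -> "Enemy losses: XXXX tanks on XX.XX"
```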
The indexes of the similar articles refer to the ids in the ukr_pravda_2y dataset (TODO check if this is factually correct).
Then, as an ML eval task, a choice of 10 masked titles from _similar_ articles is given (including the 'real' one). The `label` column points to the index of the correct masked title.
Similarity of articles is a dead-simple cosine similarity over binary vectors of the articles' tags (a sketch follows the NB below):
- a vector is built with scikit-learn's CountVectorizer, with 0 if the tag is absent and 1 if present
- similarity is the cosine similarity between the vectors of two articles
- the 10 most similar articles' titles are taken
NB this simple similarity may be suboptimal, because there are MANY UP articles with the exact same tags ("Україна, Росія, Вагнер").
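A minimal sketch of that computation, assuming scikit-learn's `CountVectorizer` (the toy tag lists below are made up for illustration):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-in for the per-article tag lists; the real ones come from the articles.
articles_tags = [
    ["Україна", "Росія", "Вагнер"],
    ["Україна", "Росія"],
    ["Україна", "економіка"],
]

# A callable analyzer makes CountVectorizer treat each tag list as its tokens;
# binary=True yields the 0/1 presence vectors described above.
vectors = CountVectorizer(analyzer=lambda tags: tags, binary=True).fit_transform(articles_tags)

sims = cosine_similarity(vectors[0], vectors).ravel()  # article 0 vs. all articles
most_similar = sims.argsort()[::-1][1:11]              # up to 10 most similar, skipping itself
print(sims, most_similar)
```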
This is done in the context of my Master's thesis, better documentation will follow soon. | shamotskyi/ukr_pravda_titles_ukr | [
"language:uk",
"region:us"
] | 2024-02-14T00:56:28+00:00 | {"language": ["uk"]} | 2024-02-15T13:50:36+00:00 | [] | [
"uk"
] | TAGS
#language-Ukrainian #region-us
|
Based on the ukr_pravda dataset: URL Licensed as CC-BY-NC 4.0.
For each article, its text and titles are given, as well as _masked_ text and title (with all digits replaced with "X").
The indexes of the similar articles refer to the ids in the ukr_pravda_2y dataset (TODO check if this is factually correct).
Then, as an ML eval task, a choice of 10 masked titles from _similar_ articles is given (including the 'real' one). The 'label' column points to the index of the correct masked title.
Similarity of articles is a dead-simple cosine similarity over binary vectors of the articles' tags:
- a vector is built with scikit-learn's CountVectorizer, with 0 if the tag is absent and 1 if present
- similarity is the cosine similarity between the vectors of two articles
- the 10 most similar articles' titles are taken
NB this simple similarity may be suboptimal, because there are MANY UP articles with the exact same tags ("Україна, Росія, Вагнер").
This is done in the context of my Master's thesis, better documentation will follow soon. | [] | [
"TAGS\n#language-Ukrainian #region-us \n"
] | [
13
] | [
"passage: TAGS\n#language-Ukrainian #region-us \n"
] |
dceed353dadb2e3e33e66c1856fafbd356a7835b |
- Based on the ukr_pravda dataset: https://huggingface.co/datasets/shamotskyi/ukr_pravda_2y
- Sister dataset: https://huggingface.co/datasets/shamotskyi/ukr_pravda_titles_ukr (same but in Ukrainian)
For each article, its text and titles are given, as well as _masked_ text and title (with all digits replaced with "X").
The indexes of the similar articles refer to the ids in the ukr_pravda_2y dataset (TODO check if this is factually correct).
Then, as an ML eval task, a choice of 10 masked titles from _similar_ articles is given (including the 'real' one). The `label` column points to the index of the correct masked title.
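A sketch of consuming that task (the `label` column name comes from this card; the candidates column name and the split are my assumptions):

```python
from datasets import load_dataset

ds = load_dataset("shamotskyi/ukr_pravda_titles_eng", split="train")  # split name assumed

ex = ds[0]
candidates = ex["masked_titles"]  # hypothetical column name for the 10 masked-title choices
print(candidates[ex["label"]])    # `label` indexes the correct (real) masked title
```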
Similarity of articles is a dead-simple cosine similarity over binary vectors of the articles' tags:
- a vector is built with scikit-learn's CountVectorizer, with 0 if the tag is absent and 1 if present
- similarity is the cosine similarity between the vectors of two articles
- the 10 most similar articles' titles are taken
NB this simple similarity may be suboptimal, because there are MANY UP articles with the exact same tags (ergo all with similarity 1.0 to the source), and there may be more similar articles (by human intuition as well as by any more reasonable metric) than the ones actually chosen.
This is done in the context of my Master's thesis, better documentation will follow soon. | shamotskyi/ukr_pravda_titles_eng | [
"language:en",
"license:cc-by-nc-4.0",
"news",
"region:us"
] | 2024-02-14T00:59:23+00:00 | {"language": ["en"], "license": "cc-by-nc-4.0", "tags": ["news"]} | 2024-02-15T13:53:55+00:00 | [] | [
"en"
] | TAGS
#language-English #license-cc-by-nc-4.0 #news #region-us
|
- Based on the ukr_pravda dataset: URL
- Sister dataset: URL (same but in Ukrainian)
For each article, its text and titles are given, as well as _masked_ text and title (with all digits replaced with "X").
The indexes of the similar articles refer to the ids in the ukr_pravda_2y dataset (TODO check if this is factually correct).
Then, as an ML eval task, a choice of 10 masked titles from _similar_ articles is given (including the 'real' one). The 'label' column points to the index of the correct masked title.
Similarity of articles is a dead-simple cosine similarity over binary vectors of the articles' tags:
- a vector is built with scikit-learn's CountVectorizer, with 0 if the tag is absent and 1 if present
- similarity is the cosine similarity between the vectors of two articles
- the 10 most similar articles' titles are taken
NB this simple similarity may be suboptimal, because there are MANY UP articles with the exact same tags (ergo all with similarity 1.0 to the source), and there may be more similar articles (by human intuition as well as by any more reasonable metric) than the ones actually chosen.
This is done in the context of my Master's thesis, better documentation will follow soon. | [] | [
"TAGS\n#language-English #license-cc-by-nc-4.0 #news #region-us \n"
] | [
23
] | [
"passage: TAGS\n#language-English #license-cc-by-nc-4.0 #news #region-us \n"
] |
925b1c266e530cf96809a81d4e1afbf8a4ca8f34 | # Dataset Card for "evesix-level0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sam-mosaic/evesix-level0 | [
"region:us"
] | 2024-02-14T01:14:07+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 761935742, "num_examples": 486455}], "download_size": 384732088, "dataset_size": 761935742}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T01:14:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "evesix-level0"
More Information needed | [
"# Dataset Card for \"evesix-level0\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"evesix-level0\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"evesix-level0\"\n\nMore Information needed"
] |
65a97a572466ac000bf4944d2faba0783f1cd159 | # Dataset Card for "filtered-ultrachat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | chiennv/filtered-ultrachat | [
"region:us"
] | 2024-02-14T01:53:47+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "conversations", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 389356552, "num_examples": 48789}], "download_size": 167189919, "dataset_size": 389356552}} | 2024-02-14T01:53:58+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "filtered-ultrachat"
More Information needed | [
"# Dataset Card for \"filtered-ultrachat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"filtered-ultrachat\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"filtered-ultrachat\"\n\nMore Information needed"
] |