| Column | Type | Min length | Max length |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | sequence | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | sequence | 0 | 25 |
| languages | sequence | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | sequence | 0 | 352 |
| processed_texts | sequence | 1 | 353 |
7efa785fe3ffead8088b5f55fb072bc9c78ec7fe |
# SJ-Donald/kor-hate-sentence
SJ-Donald/kor-hate-sentence is a dataset merged from the following sources.
## Datasets
* [smilegate-ai/kor_unsmile](https://huggingface.co/datasets/smilegate-ai/kor_unsmile)
* [korean-hate-speech](https://github.com/kocohub/korean-hate-speech/tree/master)
* [Curse-detection-data](https://github.com/2runo/Curse-detection-data)
* [korean-malicious-comments-dataset](https://github.com/seongwoong/korean-malicious-comments-dataset)
The datasets listed above are merged and duplicates are dropped.
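The exact merge script is not included in this card; the sketch below only illustrates the concat-and-deduplicate step described above, using toy stand-ins for the four sources and the `문장` (sentence) column shown in the features below.
```Python
import pandas as pd

# Toy stand-ins for the four source datasets listed above (real loading code omitted).
sources = [
    pd.DataFrame({"문장": ["예시 문장 1", "예시 문장 2"], "hate": [1, 0]}),
    pd.DataFrame({"문장": ["예시 문장 2", "예시 문장 3"], "hate": [0, 1]}),
]

# Merge everything into one table, then drop sentences that appear more than once.
merged = pd.concat(sources, ignore_index=True)
merged = merged.drop_duplicates(subset=["문장"]).reset_index(drop=True)
print(merged)
```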
## How to use
```Python
from datasets import load_dataset
ds = load_dataset("SJ-Donald/kor-hate-sentence")
print(ds)
DatasetDict({
train: Dataset({
features: ['문장', 'hate', 'clean', 'labels'],
num_rows: 29328
})
test: Dataset({
features: ['문장', 'hate', 'clean', 'labels'],
num_rows: 7333
})
})
``` | SJ-Donald/kor-hate-sentence | [
"license:cc-by-sa-4.0",
"korean",
"hate-speech",
"hate-sentence",
"region:us"
] | 2024-01-16T04:01:50+00:00 | {"license": "cc-by-sa-4.0", "tags": ["korean", "hate-speech", "hate-sentence"]} | 2024-01-24T02:27:43+00:00 | [] | [] | TAGS
#license-cc-by-sa-4.0 #korean #hate-speech #hate-sentence #region-us
|
# SJ-Donald/kor-hate-sentence
SJ-Donald/kor-hate-sentence is merged dataset from fllow
## Datasets
* smilegate-ai/kor_unsmile
* korean-hate-speech
* Curse-detection-data
* korean-malicious-comments-dataset
Merge datasets from above and drop duplicates.
## How to use
| [
"# SJ-Donald/kor-hate-sentence\n\nSJ-Donald/kor-hate-sentence is merged dataset from fllow",
"## Datasets\n\n* smilegate-ai/kor_unsmile\n* korean-hate-speech\n* Curse-detection-data\n* korean-malicious-comments-dataset\n\nMerge datasets from above and drop duplicates.",
"## How to use"
] | [
"TAGS\n#license-cc-by-sa-4.0 #korean #hate-speech #hate-sentence #region-us \n",
"# SJ-Donald/kor-hate-sentence\n\nSJ-Donald/kor-hate-sentence is merged dataset from fllow",
"## Datasets\n\n* smilegate-ai/kor_unsmile\n* korean-hate-speech\n* Curse-detection-data\n* korean-malicious-comments-dataset\n\nMerge datasets from above and drop duplicates.",
"## How to use"
] |
1325769d0f484a510bb435fc900bba469efbc804 |
# Dataset Card for Evaluation run of alignment-handbook/zephyr-7b-sft-full
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alignment-handbook/zephyr-7b-sft-full](https://huggingface.co/alignment-handbook/zephyr-7b-sft-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-full",
"harness_winogrande_5",
split="train")
```
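The per-task configurations can also be listed programmatically, and each configuration exposes one split per run timestamp plus a `latest` split (see the configuration metadata at the end of this card). A minimal sketch, using a configuration name taken from that metadata:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-full"

# List every per-task configuration plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs))

# Load the most recent GSM8K details; "latest" follows the splits listed in the metadata below.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```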
## Latest results
These are the [latest results from run 2024-01-16T04:10:47.293422](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-full/blob/main/results_2024-01-16T04-10-47.293422.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5996645941169282,
"acc_stderr": 0.03304659614732094,
"acc_norm": 0.606170977115199,
"acc_norm_stderr": 0.03373440620721248,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627897,
"mc2": 0.4170825132034481,
"mc2_stderr": 0.014670567942290037
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.014555949760496442,
"acc_norm": 0.5767918088737202,
"acc_norm_stderr": 0.014438036220848036
},
"harness|hellaswag|10": {
"acc": 0.608743278231428,
"acc_stderr": 0.004870342592915048,
"acc_norm": 0.8082055367456682,
"acc_norm_stderr": 0.003929076276473378
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.038607315993160904,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.038607315993160904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03899073687357335,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03899073687357335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929778,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940798,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940798
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847834,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847834
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709588,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709588
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.02500931379006971,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.02500931379006971
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0267874531119065,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0267874531119065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893934,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893934
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811943,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811943
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354032,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354032
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627897,
"mc2": 0.4170825132034481,
"mc2_stderr": 0.014670567942290037
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.011988541844843905
},
"harness|gsm8k|5": {
"acc": 0.287338893100834,
"acc_stderr": 0.012464677060107078
}
}
```
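For quick sanity checks the block above can be treated as plain JSON; the sketch below assumes it has been saved verbatim to a local file and averages `acc_norm` over the `hendrycksTest` (MMLU) subtasks:
```python
import json

# Assumes the results block above was saved verbatim as a local JSON file.
with open("results_2024-01-16T04-10-47.293422.json") as f:
    scores = json.load(f)

# Average normalized accuracy over the hendrycksTest (MMLU) subtasks shown above.
mmlu = [v["acc_norm"] for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
print(round(sum(mmlu) / len(mmlu), 4))

# Overall aggregates reported for this run.
print(scores["all"])
```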
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-full | [
"region:us"
] | 2024-01-16T04:09:14+00:00 | {"pretty_name": "Evaluation run of alignment-handbook/zephyr-7b-sft-full", "dataset_summary": "Dataset automatically created during the evaluation run of model [alignment-handbook/zephyr-7b-sft-full](https://huggingface.co/alignment-handbook/zephyr-7b-sft-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-full\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T04:10:47.293422](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-full/blob/main/results_2024-01-16T04-10-47.293422.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5996645941169282,\n \"acc_stderr\": 0.03304659614732094,\n \"acc_norm\": 0.606170977115199,\n \"acc_norm_stderr\": 0.03373440620721248,\n \"mc1\": 0.27906976744186046,\n \"mc1_stderr\": 0.015702107090627897,\n \"mc2\": 0.4170825132034481,\n \"mc2_stderr\": 0.014670567942290037\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.014555949760496442,\n \"acc_norm\": 0.5767918088737202,\n \"acc_norm_stderr\": 0.014438036220848036\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.608743278231428,\n \"acc_stderr\": 0.004870342592915048,\n \"acc_norm\": 0.8082055367456682,\n \"acc_norm_stderr\": 0.003929076276473378\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.038607315993160904,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.038607315993160904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.03899073687357335,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03899073687357335\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929778,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n 
\"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940798,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940798\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693264,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693264\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847834,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847834\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.014866821664709588,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.014866821664709588\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.02500931379006971,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.02500931379006971\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0267874531119065,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0267874531119065\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n \"acc_stderr\": 0.012628393551811943,\n \"acc_norm\": 0.4256844850065189,\n \"acc_norm_stderr\": 0.012628393551811943\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354032,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354032\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n \"mc1_stderr\": 0.015702107090627897,\n \"mc2\": 0.4170825132034481,\n \"mc2_stderr\": 0.014670567942290037\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843905\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.287338893100834,\n \"acc_stderr\": 
0.012464677060107078\n }\n}\n```", "repo_url": "https://huggingface.co/alignment-handbook/zephyr-7b-sft-full", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|arc:challenge|25_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|arc:challenge|25_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|gsm8k|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|gsm8k|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hellaswag|10_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hellaswag|10_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T04-06-55.134598.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T04-06-55.134598.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T04-10-47.293422.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T04-10-47.293422.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T04-10-47.293422.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T04-10-47.293422.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T04-10-47.293422.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": 
"2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T04-06-55.134598.parquet"]}, 
{"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["**/details_harness|winogrande|5_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": ["**/details_harness|winogrande|5_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T04-10-47.293422.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T04_06_55.134598", "path": ["results_2024-01-16T04-06-55.134598.parquet"]}, {"split": "2024_01_16T04_10_47.293422", "path": 
["results_2024-01-16T04-10-47.293422.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T04-10-47.293422.parquet"]}]}]} | 2024-01-16T04:13:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alignment-handbook/zephyr-7b-sft-full
Dataset automatically created during the evaluation run of model alignment-handbook/zephyr-7b-sft-full on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
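A minimal sketch (the repo id assumes the `details_<org>__<model>` naming used by the other evaluation-run cards in this dump; `harness_winogrande_5` is one of the configs listed in this card's metadata):

```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> naming convention;
# "harness_winogrande_5" is one of the configs listed in this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-full",
    "harness_winogrande_5",
    split="train",
)
```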
## Latest results
These are the latest results from run 2024-01-16T04:10:47.293422 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
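The aggregated numbers for that run can be pulled from the "results" config of this repo; a minimal sketch (the "results" config and "latest" split names come from the metadata above, while the repo id assumes the same `details_<org>__<model>` naming as elsewhere in this dump):

```python
from datasets import load_dataset

# "results" config and "latest" split as listed in this card's metadata;
# repo id assumed from the details_<org>__<model> naming convention.
latest = load_dataset(
    "open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-full",
    "results",
    split="latest",
)
print(latest[0])  # aggregated metrics for the 2024-01-16T04:10:47 run
```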
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alignment-handbook/zephyr-7b-sft-full\n\n\n\nDataset automatically created during the evaluation run of model alignment-handbook/zephyr-7b-sft-full on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T04:10:47.293422(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alignment-handbook/zephyr-7b-sft-full\n\n\n\nDataset automatically created during the evaluation run of model alignment-handbook/zephyr-7b-sft-full on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T04:10:47.293422(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
23d872c9329fcf050bc42992ba89de4649afd24c |
Used TheBloke/OpenHermes-2-Mistral-7B-GPTQ to convert chunks into QA pairs used for finetuning | ogbrandt/pjf-podcast-qa-sharegpt | [
"license:apache-2.0",
"region:us"
] | 2024-01-16T04:11:20+00:00 | {"license": "apache-2.0"} | 2024-01-16T04:15:57+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
Used TheBloke/OpenHermes-2-Mistral-7B-GPTQ to convert chunks into QA pairs used for finetuning | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
d3b944798d48575d2a35623609b5b7df826c12ed |
This is a subset (1000 samples) of [`timdettmers/openassistant-guanaco`](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) dataset, processed to match Mistral-7B-instruct-v0.2's prompt format as described [in this article](https://huggingface.co/blog/llama2#how-to-prompt-llama-2). It was created using the [colab notebook](https://colab.research.google.com/drive/1afeicfJa9Mo8-wEcDoGrjyoVLyFkF9xm?usp=sharing).
Inspired by Maxime Labonne's [llm-course repo](https://github.com/mlabonne/llm-course). | wenqiglantz/guanaco-llama2-1k | [
"region:us"
] | 2024-01-16T04:27:02+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1654448, "num_examples": 1000}], "download_size": 966694, "dataset_size": 1654448}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T04:29:30+00:00 | [] | [] | TAGS
#region-us
|
This is a subset (1000 samples) of 'timdettmers/openassistant-guanaco' dataset, processed to match Mistral-7B-instruct-v0.2's prompt format as described in this article. It was created using the colab notebook.
Inspired by Maxime Labonne's llm-course repo. | [] | [
"TAGS\n#region-us \n"
] |
952b12aba069f8ea09defee6555b1f1e320d0b26 |
# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO](https://huggingface.co/NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-Mixtral-8x7B-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T17:09:50.643842](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-Mixtral-8x7B-DPO/blob/main/results_2024-01-22T17-09-50.643842.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7224125718980299,
"acc_stderr": 0.030022741290236767,
"acc_norm": 0.7240829737285515,
"acc_norm_stderr": 0.03062607991215834,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.01705876150134797,
"mc2": 0.5482610472622913,
"mc2_stderr": 0.014924708991833662
},
"harness|arc:challenge|25": {
"acc": 0.6953924914675768,
"acc_stderr": 0.013449522109932487,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.6870145389364668,
"acc_stderr": 0.004627607991626919,
"acc_norm": 0.8729336785500896,
"acc_norm_stderr": 0.0033236659644121946
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.03855289616378948,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.03855289616378948
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.025750949678130387,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.025750949678130387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8516129032258064,
"acc_stderr": 0.020222737554330378,
"acc_norm": 0.8516129032258064,
"acc_norm_stderr": 0.020222737554330378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5911330049261084,
"acc_stderr": 0.03459058815883233,
"acc_norm": 0.5911330049261084,
"acc_norm_stderr": 0.03459058815883233
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603915,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240524,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.02345467488940429,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.02345467488940429
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.026265024608275882,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.026265024608275882
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8954128440366973,
"acc_stderr": 0.013120530245265587,
"acc_norm": 0.8954128440366973,
"acc_norm_stderr": 0.013120530245265587
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.023784297520918853,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.023784297520918853
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758556,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758556
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.02856807946471428,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.02856807946471428
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342344,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342344
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719097,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719097
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.879948914431673,
"acc_stderr": 0.011622736692041268,
"acc_norm": 0.879948914431673,
"acc_norm_stderr": 0.011622736692041268
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.02151190065425255,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.02151190065425255
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5720670391061452,
"acc_stderr": 0.016547887997416112,
"acc_norm": 0.5720670391061452,
"acc_norm_stderr": 0.016547887997416112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8071895424836601,
"acc_stderr": 0.022589318888176703,
"acc_norm": 0.8071895424836601,
"acc_norm_stderr": 0.022589318888176703
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7909967845659164,
"acc_stderr": 0.02309314039837422,
"acc_norm": 0.7909967845659164,
"acc_norm_stderr": 0.02309314039837422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8549382716049383,
"acc_stderr": 0.019594877019727962,
"acc_norm": 0.8549382716049383,
"acc_norm_stderr": 0.019594877019727962
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.02977945095730305,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.02977945095730305
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5554106910039114,
"acc_stderr": 0.012691575792657112,
"acc_norm": 0.5554106910039114,
"acc_norm_stderr": 0.012691575792657112
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7867647058823529,
"acc_stderr": 0.02488097151229426,
"acc_norm": 0.7867647058823529,
"acc_norm_stderr": 0.02488097151229426
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.016547148636203147,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.016547148636203147
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546198,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546198
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789255,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789255
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.01705876150134797,
"mc2": 0.5482610472622913,
"mc2_stderr": 0.014924708991833662
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838911
},
"harness|gsm8k|5": {
"acc": 0.7164518574677786,
"acc_stderr": 0.012415070917508125
}
}
```
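The same figures can also be read straight from the results file linked above; a small sketch (the filename is taken from that link; the exact JSON key layout is an assumption):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results JSON linked in the "Latest results" section above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-Mixtral-8x7B-DPO",
    filename="results_2024-01-22T17-09-50.643842.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Key layout assumed: the aggregate block shown above is expected under results["results"]["all"].
print(results["results"]["all"])
```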
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-Mixtral-8x7B-DPO | [
"region:us"
] | 2024-01-16T04:46:35+00:00 | {"pretty_name": "Evaluation run of NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO](https://huggingface.co/NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-Mixtral-8x7B-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-22T17:09:50.643842](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-Mixtral-8x7B-DPO/blob/main/results_2024-01-22T17-09-50.643842.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7224125718980299,\n \"acc_stderr\": 0.030022741290236767,\n \"acc_norm\": 0.7240829737285515,\n \"acc_norm_stderr\": 0.03062607991215834,\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.01705876150134797,\n \"mc2\": 0.5482610472622913,\n \"mc2_stderr\": 0.014924708991833662\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6953924914675768,\n \"acc_stderr\": 0.013449522109932487,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6870145389364668,\n \"acc_stderr\": 0.004627607991626919,\n \"acc_norm\": 0.8729336785500896,\n \"acc_norm_stderr\": 0.0033236659644121946\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.02495991802891127,\n \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.02495991802891127\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.031164899666948617\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.030135906478517563,\n \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.030135906478517563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.03855289616378948,\n \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.03855289616378948\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5026455026455027,\n \"acc_stderr\": 0.025750949678130387,\n \"acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.025750949678130387\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8516129032258064,\n \"acc_stderr\": 0.020222737554330378,\n \"acc_norm\": 0.8516129032258064,\n \"acc_norm_stderr\": 0.020222737554330378\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5911330049261084,\n \"acc_stderr\": 0.03459058815883233,\n \"acc_norm\": 0.5911330049261084,\n \"acc_norm_stderr\": 0.03459058815883233\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603915,\n \"acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240524,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 
0.017426974154240524\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6897435897435897,\n \"acc_stderr\": 0.02345467488940429,\n \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.02345467488940429\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.026265024608275882,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.026265024608275882\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8954128440366973,\n \"acc_stderr\": 0.013120530245265587,\n \"acc_norm\": 0.8954128440366973,\n \"acc_norm_stderr\": 0.013120530245265587\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8676470588235294,\n \"acc_stderr\": 0.023784297520918853,\n \"acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.023784297520918853\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758556,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758556\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n \"acc_stderr\": 0.02856807946471428,\n \"acc_norm\": 0.7623318385650224,\n \"acc_norm_stderr\": 0.02856807946471428\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342344,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342344\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719097,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719097\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.879948914431673,\n \"acc_stderr\": 0.011622736692041268,\n \"acc_norm\": 0.879948914431673,\n \"acc_norm_stderr\": 0.011622736692041268\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.02151190065425255,\n \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.02151190065425255\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5720670391061452,\n \"acc_stderr\": 0.016547887997416112,\n \"acc_norm\": 0.5720670391061452,\n \"acc_norm_stderr\": 0.016547887997416112\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8071895424836601,\n \"acc_stderr\": 0.022589318888176703,\n \"acc_norm\": 0.8071895424836601,\n \"acc_norm_stderr\": 0.022589318888176703\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7909967845659164,\n \"acc_stderr\": 0.02309314039837422,\n \"acc_norm\": 0.7909967845659164,\n \"acc_norm_stderr\": 0.02309314039837422\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8549382716049383,\n \"acc_stderr\": 0.019594877019727962,\n \"acc_norm\": 0.8549382716049383,\n \"acc_norm_stderr\": 0.019594877019727962\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.02977945095730305,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.02977945095730305\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5554106910039114,\n \"acc_stderr\": 0.012691575792657112,\n \"acc_norm\": 0.5554106910039114,\n \"acc_norm_stderr\": 0.012691575792657112\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7867647058823529,\n \"acc_stderr\": 0.02488097151229426,\n \"acc_norm\": 0.7867647058823529,\n \"acc_norm_stderr\": 0.02488097151229426\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.016547148636203147,\n \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.016547148636203147\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546198,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546198\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789255,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789255\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.01705876150134797,\n \"mc2\": 0.5482610472622913,\n \"mc2_stderr\": 0.014924708991833662\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.7164518574677786,\n \"acc_stderr\": 0.012415070917508125\n }\n}\n```", "repo_url": "https://huggingface.co/NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|arc:challenge|25_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|arc:challenge|25_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|gsm8k|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|gsm8k|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hellaswag|10_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hellaswag|10_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T04-44-16.630676.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T04-44-16.630676.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-09-50.643842.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-09-50.643842.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-09-50.643842.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-22T17-09-50.643842.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-09-50.643842.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": 
"2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T04-44-16.630676.parquet"]}, 
{"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["**/details_harness|winogrande|5_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": ["**/details_harness|winogrande|5_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-22T17-09-50.643842.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T04_44_16.630676", "path": ["results_2024-01-16T04-44-16.630676.parquet"]}, {"split": "2024_01_22T17_09_50.643842", "path": 
["results_2024-01-22T17-09-50.643842.parquet"]}, {"split": "latest", "path": ["results_2024-01-22T17-09-50.643842.parquet"]}]}]} | 2024-01-22T17:12:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
Dataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
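A minimal loading sketch (the repository id below is inferred from the leaderboard's usual "details_<org>__<model>" naming convention, and "harness_winogrande_5" is one of the configurations listed in this card's metadata; treat both names as assumptions):

```python
from datasets import load_dataset

# Sketch: repository id assumed from the leaderboard's "details_<org>__<model>"
# convention; "harness_winogrande_5" is one of the configurations listed above.
data = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-Mixtral-8x7B-DPO",
    "harness_winogrande_5",
    split="train",
)
print(data)
```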
## Latest results
These are the latest results from run 2024-01-22T17:09:50.643842 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T17:09:50.643842(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-22T17:09:50.643842(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
72f079c5287007a6f6628f089c29c048d1bf5d81 |
# Dataset Card for Evaluation run of fionazhang/mistral-environment-all
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fionazhang/mistral-environment-all](https://huggingface.co/fionazhang/mistral-environment-all) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fionazhang__mistral-environment-all",
"harness_winogrande_5",
split="train")
```
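The aggregated metrics can be loaded the same way; here is a small sketch assuming the "results" configuration and the "latest" split described above:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "results" config and
# "latest" split names follow the conventions described in this card.
results = load_dataset(
    "open-llm-leaderboard/details_fionazhang__mistral-environment-all",
    "results",
    split="latest",
)
print(results[0])
```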
## Latest results
These are the [latest results from run 2024-01-16T05:12:37.264031](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__mistral-environment-all/blob/main/results_2024-01-16T05-12-37.264031.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2317362073283127,
"acc_stderr": 0.029930955666961398,
"acc_norm": 0.23270999956714858,
"acc_norm_stderr": 0.030730298088089192,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.479195628849322,
"mc2_stderr": 0.016335101476581883
},
"harness|arc:challenge|25": {
"acc": 0.21331058020477817,
"acc_stderr": 0.011970971742326334,
"acc_norm": 0.29436860068259385,
"acc_norm_stderr": 0.01331852846053943
},
"harness|hellaswag|10": {
"acc": 0.2590121489743079,
"acc_stderr": 0.004371969542814559,
"acc_norm": 0.25891256721768574,
"acc_norm_stderr": 0.004371422731216411
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.479195628849322,
"mc2_stderr": 0.016335101476581883
},
"harness|winogrande|5": {
"acc": 0.48697711128650356,
"acc_stderr": 0.014047718393997662
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
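The per-task numbers above can also be aggregated programmatically. The snippet below is a minimal sketch, assuming the raw results file linked above keeps the same task keys as this block (other details repositories nest them under a top-level `"results"` entry, so the code falls back either way); it downloads the run's JSON and recomputes the macro-average MMLU (`hendrycksTest`) accuracy:
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run (filename taken from the link above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_fionazhang__mistral-environment-all",
    filename="results_2024-01-16T05-12-37.264031.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)
# Use the top level directly if the file is not wrapped in a "results" key.
results = data.get("results", data)

# Macro-average of acc_norm over the 57 MMLU (hendrycksTest) subtasks.
mmlu = [v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```
With the values shown above, the mean should land near the ~0.23 figures reported for the individual subtasks.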
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fionazhang__mistral-environment-all | [
"region:us"
] | 2024-01-16T05:14:59+00:00 | {"pretty_name": "Evaluation run of fionazhang/mistral-environment-all", "dataset_summary": "Dataset automatically created during the evaluation run of model [fionazhang/mistral-environment-all](https://huggingface.co/fionazhang/mistral-environment-all) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fionazhang__mistral-environment-all\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T05:12:37.264031](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__mistral-environment-all/blob/main/results_2024-01-16T05-12-37.264031.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2317362073283127,\n \"acc_stderr\": 0.029930955666961398,\n \"acc_norm\": 0.23270999956714858,\n \"acc_norm_stderr\": 0.030730298088089192,\n \"mc1\": 0.2423500611995104,\n \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.479195628849322,\n \"mc2_stderr\": 0.016335101476581883\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21331058020477817,\n \"acc_stderr\": 0.011970971742326334,\n \"acc_norm\": 0.29436860068259385,\n \"acc_norm_stderr\": 0.01331852846053943\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2590121489743079,\n \"acc_stderr\": 0.004371969542814559,\n \"acc_norm\": 0.25891256721768574,\n \"acc_norm_stderr\": 0.004371422731216411\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 
0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.479195628849322,\n \"mc2_stderr\": 0.016335101476581883\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.48697711128650356,\n \"acc_stderr\": 0.014047718393997662\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/fionazhang/mistral-environment-all", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|arc:challenge|25_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|gsm8k|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hellaswag|10_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-12-37.264031.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-12-37.264031.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-12-37.264031.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T05-12-37.264031.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-12-37.264031.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-12-37.264031.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["**/details_harness|winogrande|5_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T05-12-37.264031.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T05_12_37.264031", "path": ["results_2024-01-16T05-12-37.264031.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T05-12-37.264031.parquet"]}]}]} | 2024-01-16T05:15:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fionazhang/mistral-environment-all
Dataset automatically created during the evaluation run of model fionazhang/mistral-environment-all on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
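A minimal sketch, assuming the standard Open LLM Leaderboard details-repository naming for this model (the repository name below is an assumption; the config and split names come from this card's configuration list):

```python
from datasets import load_dataset

# Repository name assumed from the model id above; the config and split names
# follow the configurations described in this card.
data = load_dataset(
    "open-llm-leaderboard/details_fionazhang__mistral-environment-all",
    "harness_winogrande_5",
    split="train",
)
```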
## Latest results
These are the latest results from run 2024-01-16T05:12:37.264031 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fionazhang/mistral-environment-all\n\n\n\nDataset automatically created during the evaluation run of model fionazhang/mistral-environment-all on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T05:12:37.264031(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fionazhang/mistral-environment-all\n\n\n\nDataset automatically created during the evaluation run of model fionazhang/mistral-environment-all on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T05:12:37.264031(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
92becd0bc36f9a606c539badbf3a41911771cfe8 |
# Dataset Card for FairFace
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Repository:** [https://github.com/joojs/fairface](https://github.com/joojs/fairface)
- **Paper:** [FairFace: Face Attribute Dataset for Balanced Race, Gender, and Age for Bias Measurement and Mitigation](https://openaccess.thecvf.com/content/WACV2021/html/Karkkainen_FairFace_Face_Attribute_Dataset_for_Balanced_Race_Gender_and_Age_WACV_2021_paper.html)
### Dataset Summary
A dataset of human faces annotated with discrete categories for the photographed person's age, sex, and race. Please consider prioritizing [a previously created Hugging Face dataset repository for Fair Face](https://huggingface.co/datasets/HuggingFaceM4/FairFace), as this new dataset repository was created only to work around downloading issues that may already be resolved.
For complete details on the dataset's construction and intended uses, please refer to the dataset's official repository or paper.
## Dataset Structure
### Data Instances
Each instance contains an image and discrete categories for age, gender, and race.
```
{
'file': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=448x448>,
'age': '50-59',
'gender': 'Male',
'race': 'East Asian',
'service_test': True
}
```
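Such an instance can be obtained with the `datasets` library; a minimal sketch, using the `margin025`/`margin125` config names and `train`/`val` splits listed in this repository's metadata:

```python
from datasets import load_dataset

# "margin025" and "margin125" select the two face-crop paddings listed in the
# repo metadata; the available splits are "train" and "val".
fairface = load_dataset("ryanramos/fairface", "margin025", split="train")

example = fairface[0]
print(example["age"], example["gender"], example["race"], example["service_test"])
```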
### Data Fields
- `file`: an image of a human face with either padding = 0.25 or padding = 1.25 depending on the dataset config
- `age`: a category describing the age of the person in the image limited to *0-2*, *3-9*, *10-19*, *20-29*, *30-39*, *40-49*, *50-59*, *60-69*, and *more than 70*
- `gender`: a category describing the sex of the person in the image limited to *Male* and *Female*
- `race`: a category describing the race of the person in the image limited to *East Asian*, *Indian*, *Black*, *White*, *Middle Eastern*, *Latino_Hispanic*, and *Southeast Asian*
- `service_test`: please refer to this [issue](https://github.com/dchen236/FairFace/issues/20) from the dataset's official repository
## Additional Information
### Licensing Information
According to the official repository, FairFace is licensed under [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/).
### Citation Information
```
@InProceedings{Karkkainen_2021_WACV,
author = {Karkkainen, Kimmo and Joo, Jungseock},
title = {FairFace: Face Attribute Dataset for Balanced Race, Gender, and Age for Bias Measurement and Mitigation},
booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
month = {January},
year = {2021},
pages = {1548-1558}
}
``` | ryanramos/fairface | [
"license:cc-by-4.0",
"region:us"
] | 2024-01-16T05:20:31+00:00 | {"license": "cc-by-4.0", "dataset_info": [{"config_name": "margin025", "features": [{"name": "file", "dtype": "image"}, {"name": "age", "dtype": "string"}, {"name": "gender", "dtype": "string"}, {"name": "race", "dtype": "string"}, {"name": "service_test", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 513550350.352, "num_examples": 86744}, {"name": "val", "num_bytes": 64534471.096, "num_examples": 10954}], "download_size": 563297165, "dataset_size": 578084821.448}, {"config_name": "margin125", "features": [{"name": "file", "dtype": "image"}, {"name": "age", "dtype": "string"}, {"name": "gender", "dtype": "string"}, {"name": "race", "dtype": "string"}, {"name": "service_test", "dtype": "bool"}, {"name": "image_features", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 2048362216.0, "num_examples": 86744}, {"name": "val", "num_bytes": 259645815.75, "num_examples": 10954}], "download_size": 2403949938, "dataset_size": 2308008031.75}], "configs": [{"config_name": "margin025", "data_files": [{"split": "train", "path": "margin025/train-*"}, {"split": "val", "path": "margin025/val-*"}]}, {"config_name": "margin125", "data_files": [{"split": "train", "path": "margin125/train-*"}, {"split": "val", "path": "margin125/val-*"}]}]} | 2024-01-31T16:55:47+00:00 | [] | [] | TAGS
#license-cc-by-4.0 #region-us
|
# Dataset Card for FairFace
## Table of Contents
- Dataset Card Creation Guide
- Table of Contents
- Dataset Description
- Dataset Summary
- Dataset Structure
- Data Instances
- Data Fields
- Additional Information
- Licensing Information
- Citation Information
## Dataset Description
- Repository: URL
- Paper: FairFace: Face Attribute Dataset for Balanced Race, Gender, and Age for Bias Measurement and Mitigation
### Dataset Summary
A dataset of human faces annotated with discrete categories for the photographed person's age, sex, and race. Please consider prioritizing a previously created Hugging Face dataset repository for Fair Face, as this new dataset repository was created only to work around downloading issues that may already be resolved.
For complete details on the dataset's construction and intended uses, please refer to the dataset's official repository or paper.
## Dataset Structure
### Data Instances
Each instance contains an image and discrete categories for age, gender, and race.
### Data Fields
- 'file': an image of a human face with either padding = 0.25 or padding = 1.25 depending on the dataset config
- 'age': a category describing the age of the person in the image limited to *0-2*, *3-9*, *10-19*, *20-29*, *30-39*, *40-49*, *50-59*, *60-69*, and *more than 70*
- 'gender': a category describing the sex of the person in the image limited to *Male* and *Female*
- 'race': a category describing the race of the person in the image limited to *East Asian*, *Indian*, *Black*, *White*, *Middle Eastern*, *Latino_Hispanic*, and *Southeast Asian*
- 'service_test': please refer to this issue from the dataset's official repository
## Additional Information
### Licensing Information
According to the official repository, FairFace is licensed under CC BY 4.0.
| [
"# Dataset Card for FairFace",
"## Table of Contents\n- Dataset Card Creation Guide\n - Table of Contents\n - Dataset Description\n - Dataset Summary\n - Dataset Structure\n - Data Instances\n - Data Fields\n - Additional Information\n - Licensing Information\n - Citation Information",
"## Dataset Description\n\n- Repository: URL\n- Paper: FairFace: Face Attribute Dataset for Balanced Race, Gender, and Age for Bias Measurement and Mitigation",
"### Dataset Summary\n\nA dataset of human faces annotated with discrete categories for the photographed person's age, sex, and race. Please consider prioritizing a previously created Hugging Face dataset repository for Fair Face as this new dataset repository was only made for downloading issues that may already be resolved.\n\nFor complete details on the dataset's construction and intended uses, please refer to the dataset's official repository or paper.",
"## Dataset Structure",
"### Data Instances\n\nEach instance contains an image and discrete categories for age, gender, and race.",
"### Data Fields\n\n- 'file': an image of a human face with either padding = 0.25 or padding = 1.25 depending on the dataset config\n- 'age': a category describing the age of the person in the image limited to *0-2*, *3-9*, *10-19*, *20-29*, *30-39*, *40-49*, *50-59*, *60-69*, and *more than 70*\n- 'gender': a category describing the sex of the person in the image limited to *Male* and *Female*\n- 'race': a category describing the race of the person in the image limited to *East Asian*, *Indian*, *Black*, *White*, *Middle Eastern*, *Latino_Hispanic*, and *Southeast Asian*\n- 'service_test': please refer to this issue from the dataset's official repository",
"## Additional Information",
"### Licensing Information\n\nAccording to the official repository, FairFace is licensed under CC BY 4.0."
] | [
"TAGS\n#license-cc-by-4.0 #region-us \n",
"# Dataset Card for FairFace",
"## Table of Contents\n- Dataset Card Creation Guide\n - Table of Contents\n - Dataset Description\n - Dataset Summary\n - Dataset Structure\n - Data Instances\n - Data Fields\n - Additional Information\n - Licensing Information\n - Citation Information",
"## Dataset Description\n\n- Repository: URL\n- Paper: FairFace: Face Attribute Dataset for Balanced Race, Gender, and Age for Bias Measurement and Mitigation",
"### Dataset Summary\n\nA dataset of human faces annotated with discrete categories for the photographed person's age, sex, and race. Please consider prioritizing a previously created Hugging Face dataset repository for Fair Face as this new dataset repository was only made for downloading issues that may already be resolved.\n\nFor complete details on the dataset's construction and intended uses, please refer to the dataset's official repository or paper.",
"## Dataset Structure",
"### Data Instances\n\nEach instance contains an image and discrete categories for age, gender, and race.",
"### Data Fields\n\n- 'file': an image of a human face with either padding = 0.25 or padding = 1.25 depending on the dataset config\n- 'age': a category describing the age of the person in the image limited to *0-2*, *3-9*, *10-19*, *20-29*, *30-39*, *40-49*, *50-59*, *60-69*, and *more than 70*\n- 'gender': a category describing the sex of the person in the image limited to *Male* and *Female*\n- 'race': a category describing the race of the person in the image limited to *East Asian*, *Indian*, *Black*, *White*, *Middle Eastern*, *Latino_Hispanic*, and *Southeast Asian*\n- 'service_test': please refer to this issue from the dataset's official repository",
"## Additional Information",
"### Licensing Information\n\nAccording to the official repository, FairFace is licensed under CC BY 4.0."
] |
dd9f97021c5afd2aac0dab0cad539c95dd082cc3 |
# Dataset Card for Evaluation run of stanford-oval/Llama-2-7b-WikiChat-fused
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [stanford-oval/Llama-2-7b-WikiChat-fused](https://huggingface.co/stanford-oval/Llama-2-7b-WikiChat-fused) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused",
"harness_winogrande_5",
split="train")
```
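The aggregated metrics described above are stored in the "results" configuration and can be loaded the same way (a minimal sketch; the "latest" split always points to the most recent run):

```python
from datasets import load_dataset

# "results" holds the aggregated run metrics; "latest" points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused",
    "results",
    split="latest",
)
print(results[0])
```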
## Latest results
These are the [latest results from run 2024-01-16T05:29:49.524111](https://huggingface.co/datasets/open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused/blob/main/results_2024-01-16T05-29-49.524111.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3993615462452277,
"acc_stderr": 0.03426948672248777,
"acc_norm": 0.4047541011703342,
"acc_norm_stderr": 0.035187845841979204,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.01618574435514492,
"mc2": 0.46362057203759177,
"mc2_stderr": 0.015792473170958256
},
"harness|arc:challenge|25": {
"acc": 0.47696245733788395,
"acc_stderr": 0.014595873205358262,
"acc_norm": 0.5068259385665529,
"acc_norm_stderr": 0.014610029151379813
},
"harness|hellaswag|10": {
"acc": 0.5690101573391755,
"acc_stderr": 0.00494202620027959,
"acc_norm": 0.7499502091216889,
"acc_norm_stderr": 0.004321564303822422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4290322580645161,
"acc_stderr": 0.028156036538233217,
"acc_norm": 0.4290322580645161,
"acc_norm_stderr": 0.028156036538233217
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617715,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398393,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.03851716319398393
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5492227979274611,
"acc_stderr": 0.03590910952235524,
"acc_norm": 0.5492227979274611,
"acc_norm_stderr": 0.03590910952235524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4153846153846154,
"acc_stderr": 0.02498535492310234,
"acc_norm": 0.4153846153846154,
"acc_norm_stderr": 0.02498535492310234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.025787874220959316,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.025787874220959316
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969654,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5247706422018349,
"acc_stderr": 0.021410999753635914,
"acc_norm": 0.5247706422018349,
"acc_norm_stderr": 0.021410999753635914
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4388185654008439,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.4388185654008439,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4618834080717489,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.4618834080717489,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4462809917355372,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.4462809917355372,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.038818912133343826,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.038818912133343826
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5811965811965812,
"acc_stderr": 0.03232128912157792,
"acc_norm": 0.5811965811965812,
"acc_norm_stderr": 0.03232128912157792
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5759897828863346,
"acc_stderr": 0.017672263329084222,
"acc_norm": 0.5759897828863346,
"acc_norm_stderr": 0.017672263329084222
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.025992472029306376,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.025992472029306376
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28044692737430166,
"acc_stderr": 0.01502408388332289,
"acc_norm": 0.28044692737430166,
"acc_norm_stderr": 0.01502408388332289
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.028213504177824103,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.028213504177824103
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.44694533762057875,
"acc_stderr": 0.028237769422085335,
"acc_norm": 0.44694533762057875,
"acc_norm_stderr": 0.028237769422085335
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3487654320987654,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.3487654320987654,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503786,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503786
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3200782268578879,
"acc_stderr": 0.011914791947638519,
"acc_norm": 0.3200782268578879,
"acc_norm_stderr": 0.011914791947638519
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.34967320261437906,
"acc_stderr": 0.019291961895066365,
"acc_norm": 0.34967320261437906,
"acc_norm_stderr": 0.019291961895066365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3469387755102041,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.3469387755102041,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5422885572139303,
"acc_stderr": 0.035228658640995975,
"acc_norm": 0.5422885572139303,
"acc_norm_stderr": 0.035228658640995975
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.0371172519074075,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.0371172519074075
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5906432748538012,
"acc_stderr": 0.03771283107626544,
"acc_norm": 0.5906432748538012,
"acc_norm_stderr": 0.03771283107626544
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.01618574435514492,
"mc2": 0.46362057203759177,
"mc2_stderr": 0.015792473170958256
},
"harness|winogrande|5": {
"acc": 0.6906077348066298,
"acc_stderr": 0.012991329330822999
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225212
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused | [
"region:us"
] | 2024-01-16T05:32:10+00:00 | {"pretty_name": "Evaluation run of stanford-oval/Llama-2-7b-WikiChat-fused", "dataset_summary": "Dataset automatically created during the evaluation run of model [stanford-oval/Llama-2-7b-WikiChat-fused](https://huggingface.co/stanford-oval/Llama-2-7b-WikiChat-fused) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T05:29:49.524111](https://huggingface.co/datasets/open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused/blob/main/results_2024-01-16T05-29-49.524111.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3993615462452277,\n \"acc_stderr\": 0.03426948672248777,\n \"acc_norm\": 0.4047541011703342,\n \"acc_norm_stderr\": 0.035187845841979204,\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.01618574435514492,\n \"mc2\": 0.46362057203759177,\n \"mc2_stderr\": 0.015792473170958256\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.47696245733788395,\n \"acc_stderr\": 0.014595873205358262,\n \"acc_norm\": 0.5068259385665529,\n \"acc_norm_stderr\": 0.014610029151379813\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5690101573391755,\n \"acc_stderr\": 0.00494202620027959,\n \"acc_norm\": 0.7499502091216889,\n \"acc_norm_stderr\": 0.004321564303822422\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.030365050829115208,\n \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.030365050829115208\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n \"acc_norm_stderr\": 0.04171115858181618\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.3179190751445087,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4290322580645161,\n \"acc_stderr\": 0.028156036538233217,\n \"acc_norm\": 0.4290322580645161,\n \"acc_norm_stderr\": 0.028156036538233217\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617715,\n \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398393,\n \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398393\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4898989898989899,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5492227979274611,\n \"acc_stderr\": 0.03590910952235524,\n \"acc_norm\": 0.5492227979274611,\n 
\"acc_norm_stderr\": 0.03590910952235524\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4153846153846154,\n \"acc_stderr\": 0.02498535492310234,\n \"acc_norm\": 0.4153846153846154,\n \"acc_norm_stderr\": 0.02498535492310234\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959316,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959316\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.032104790510157764,\n \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.032104790510157764\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969654,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5247706422018349,\n \"acc_stderr\": 0.021410999753635914,\n \"acc_norm\": 0.5247706422018349,\n \"acc_norm_stderr\": 0.021410999753635914\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.03492406104163613,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.03492406104163613\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4388185654008439,\n \"acc_stderr\": 0.032302649315470375,\n \"acc_norm\": 0.4388185654008439,\n \"acc_norm_stderr\": 0.032302649315470375\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262971,\n \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262971\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4462809917355372,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\": 0.4462809917355372,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.038818912133343826,\n \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.038818912133343826\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5811965811965812,\n \"acc_stderr\": 0.03232128912157792,\n \"acc_norm\": 0.5811965811965812,\n \"acc_norm_stderr\": 0.03232128912157792\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5759897828863346,\n \"acc_stderr\": 0.017672263329084222,\n \"acc_norm\": 0.5759897828863346,\n \"acc_norm_stderr\": 0.017672263329084222\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3699421965317919,\n \"acc_stderr\": 0.025992472029306376,\n \"acc_norm\": 0.3699421965317919,\n \"acc_norm_stderr\": 0.025992472029306376\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28044692737430166,\n \"acc_stderr\": 0.01502408388332289,\n \"acc_norm\": 0.28044692737430166,\n \"acc_norm_stderr\": 0.01502408388332289\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.028213504177824103,\n \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.028213504177824103\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.44694533762057875,\n \"acc_stderr\": 0.028237769422085335,\n \"acc_norm\": 0.44694533762057875,\n \"acc_norm_stderr\": 0.028237769422085335\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3487654320987654,\n \"acc_stderr\": 0.02651759772446501,\n \"acc_norm\": 0.3487654320987654,\n \"acc_norm_stderr\": 0.02651759772446501\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503786,\n \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503786\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3200782268578879,\n \"acc_stderr\": 0.011914791947638519,\n \"acc_norm\": 0.3200782268578879,\n \"acc_norm_stderr\": 0.011914791947638519\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.34967320261437906,\n \"acc_stderr\": 0.019291961895066365,\n \"acc_norm\": 0.34967320261437906,\n \"acc_norm_stderr\": 0.019291961895066365\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3469387755102041,\n \"acc_stderr\": 0.030472526026726496,\n \"acc_norm\": 0.3469387755102041,\n \"acc_norm_stderr\": 0.030472526026726496\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5422885572139303,\n \"acc_stderr\": 0.035228658640995975,\n \"acc_norm\": 0.5422885572139303,\n \"acc_norm_stderr\": 0.035228658640995975\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n \"acc_stderr\": 0.0371172519074075,\n \"acc_norm\": 0.3493975903614458,\n \"acc_norm_stderr\": 0.0371172519074075\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5906432748538012,\n \"acc_stderr\": 0.03771283107626544,\n \"acc_norm\": 0.5906432748538012,\n \"acc_norm_stderr\": 0.03771283107626544\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.01618574435514492,\n \"mc2\": 0.46362057203759177,\n \"mc2_stderr\": 0.015792473170958256\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6906077348066298,\n \"acc_stderr\": 0.012991329330822999\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225212\n }\n}\n```", "repo_url": "https://huggingface.co/stanford-oval/Llama-2-7b-WikiChat-fused", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|arc:challenge|25_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|gsm8k|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hellaswag|10_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-29-49.524111.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-29-49.524111.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-29-49.524111.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T05-29-49.524111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-29-49.524111.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["**/details_harness|winogrande|5_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-16T05-29-49.524111.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T05_29_49.524111", "path": ["results_2024-01-16T05-29-49.524111.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T05-29-49.524111.parquet"]}]}]} | 2024-01-16T05:32:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of stanford-oval/Llama-2-7b-WikiChat-fused
Dataset automatically created during the evaluation run of model stanford-oval/Llama-2-7b-WikiChat-fused on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
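A minimal sketch, assuming the details repository follows the leaderboard's usual naming scheme (`open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused`) and using the `harness_winogrande_5` configuration listed in the file metadata above:

```python
from datasets import load_dataset

# Repository id and configuration name follow the pattern used by other
# leaderboard detail datasets and are assumptions, not confirmed by this card.
data = load_dataset(
    "open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused",
    "harness_winogrande_5",
    split="train",
)
```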
## Latest results
These are the latest results from run 2024-01-16T05:29:49.524111 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of stanford-oval/Llama-2-7b-WikiChat-fused\n\n\n\nDataset automatically created during the evaluation run of model stanford-oval/Llama-2-7b-WikiChat-fused on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T05:29:49.524111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of stanford-oval/Llama-2-7b-WikiChat-fused\n\n\n\nDataset automatically created during the evaluation run of model stanford-oval/Llama-2-7b-WikiChat-fused on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T05:29:49.524111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f425a3b3c1f32149821f1a7a456ff019b96b58b2 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | jw0303/123 | [
"region:us"
] | 2024-01-16T05:59:17+00:00 | {} | 2024-01-16T06:01:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
c538ca82041daca7210a3ca626e48fcabd45bdb7 |
# Dataset Card for Evaluation run of FelixChao/ExtremeDolphin-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/ExtremeDolphin-MoE](https://huggingface.co/FelixChao/ExtremeDolphin-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__ExtremeDolphin-MoE",
"harness_winogrande_5",
split="train")
```
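The aggregated metrics live in the separate "results" configuration mentioned above; a minimal sketch of loading it, assuming the "latest" split naming used elsewhere in this card:

```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split always points to the most
# recent evaluation run recorded for this model.
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__ExtremeDolphin-MoE",
    "results",
    split="latest",
)
```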
## Latest results
These are the [latest results from run 2024-01-16T06:21:06.917044](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__ExtremeDolphin-MoE/blob/main/results_2024-01-16T06-21-06.917044.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.640534843831089,
"acc_stderr": 0.032262447071264584,
"acc_norm": 0.6415654928978998,
"acc_norm_stderr": 0.032914024111090026,
"mc1": 0.39412484700122397,
"mc1_stderr": 0.017106588140700322,
"mc2": 0.5727996110770869,
"mc2_stderr": 0.015406323461664524
},
"harness|arc:challenge|25": {
"acc": 0.6177474402730375,
"acc_stderr": 0.014200454049979279,
"acc_norm": 0.6510238907849829,
"acc_norm_stderr": 0.013928933461382501
},
"harness|hellaswag|10": {
"acc": 0.6684923322047401,
"acc_stderr": 0.004697929774670295,
"acc_norm": 0.8606851224855606,
"acc_norm_stderr": 0.003455671196993115
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473075,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431353,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431353
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057222,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057222
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310263,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.01273239828619044,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.01273239828619044
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.01911721391149515,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.01911721391149515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39412484700122397,
"mc1_stderr": 0.017106588140700322,
"mc2": 0.5727996110770869,
"mc2_stderr": 0.015406323461664524
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722757
},
"harness|gsm8k|5": {
"acc": 0.6588324488248674,
"acc_stderr": 0.01305911193583151
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__ExtremeDolphin-MoE | [
"region:us"
] | 2024-01-16T06:23:23+00:00 | {"pretty_name": "Evaluation run of FelixChao/ExtremeDolphin-MoE", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/ExtremeDolphin-MoE](https://huggingface.co/FelixChao/ExtremeDolphin-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__ExtremeDolphin-MoE\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T06:21:06.917044](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__ExtremeDolphin-MoE/blob/main/results_2024-01-16T06-21-06.917044.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.640534843831089,\n \"acc_stderr\": 0.032262447071264584,\n \"acc_norm\": 0.6415654928978998,\n \"acc_norm_stderr\": 0.032914024111090026,\n \"mc1\": 0.39412484700122397,\n \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.5727996110770869,\n \"mc2_stderr\": 0.015406323461664524\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6177474402730375,\n \"acc_stderr\": 0.014200454049979279,\n \"acc_norm\": 0.6510238907849829,\n \"acc_norm_stderr\": 0.013928933461382501\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6684923322047401,\n \"acc_stderr\": 0.004697929774670295,\n \"acc_norm\": 0.8606851224855606,\n \"acc_norm_stderr\": 0.003455671196993115\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.032555253593403555,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.032555253593403555\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473075,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431353,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431353\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n \"acc_stderr\": 0.016051419760310263,\n \"acc_norm\": 0.35977653631284917,\n \"acc_norm_stderr\": 0.016051419760310263\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.01911721391149515,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.01911721391149515\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39412484700122397,\n \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.5727996110770869,\n \"mc2_stderr\": 0.015406323461664524\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722757\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6588324488248674,\n \"acc_stderr\": 
0.01305911193583151\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/ExtremeDolphin-MoE", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|arc:challenge|25_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|gsm8k|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hellaswag|10_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T06-21-06.917044.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T06-21-06.917044.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T06-21-06.917044.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T06-21-06.917044.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T06-21-06.917044.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T06_21_06.917044", "path": ["**/details_harness|winogrande|5_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T06-21-06.917044.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_16T06_21_06.917044", "path": ["results_2024-01-16T06-21-06.917044.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T06-21-06.917044.parquet"]}]}]} | 2024-01-16T06:23:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FelixChao/ExtremeDolphin-MoE
Dataset automatically created during the evaluation run of model FelixChao/ExtremeDolphin-MoE on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
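```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 63 available configs; any other config name from this repo works the same way
data = load_dataset("open-llm-leaderboard/details_FelixChao__ExtremeDolphin-MoE",
                    "harness_winogrande_5",
                    split="train")
```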
## Latest results
These are the latest results from run 2024-01-16T06:21:06.917044 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
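```python
# Aggregate and headline-task entries only; the complete per-task results JSON is stored in this repository.
{
    "all": {
        "acc": 0.640534843831089,
        "acc_stderr": 0.032262447071264584,
        "acc_norm": 0.6415654928978998,
        "acc_norm_stderr": 0.032914024111090026,
        "mc1": 0.39412484700122397,
        "mc1_stderr": 0.017106588140700322,
        "mc2": 0.5727996110770869,
        "mc2_stderr": 0.015406323461664524
    },
    "harness|arc:challenge|25": {
        "acc": 0.6177474402730375,
        "acc_stderr": 0.014200454049979279,
        "acc_norm": 0.6510238907849829,
        "acc_norm_stderr": 0.013928933461382501
    },
    "harness|hellaswag|10": {
        "acc": 0.6684923322047401,
        "acc_stderr": 0.004697929774670295,
        "acc_norm": 0.8606851224855606,
        "acc_norm_stderr": 0.003455671196993115
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.39412484700122397,
        "mc1_stderr": 0.017106588140700322,
        "mc2": 0.5727996110770869,
        "mc2_stderr": 0.015406323461664524
    },
    "harness|winogrande|5": {
        "acc": 0.7868981846882399,
        "acc_stderr": 0.011508957690722757
    },
    "harness|gsm8k|5": {
        "acc": 0.6588324488248674,
        "acc_stderr": 0.01305911193583151
    }
}
```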
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FelixChao/ExtremeDolphin-MoE\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/ExtremeDolphin-MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T06:21:06.917044(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FelixChao/ExtremeDolphin-MoE\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/ExtremeDolphin-MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T06:21:06.917044(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
ecbe5c5d19039b75bba4fba81109f40ed4fb4c4a |
Revisions to `alphalm/gt1_8kElo_all_tokenized`:
- model_max_length: 8192 v0 -> 4096 v1
- Only add eos_token to checkmate games | alphalm/gt1_8kElo_all_tokenized-v1 | [
"license:mit",
"region:us"
] | 2024-01-16T06:23:33+00:00 | {"license": "mit"} | 2024-01-21T20:46:20+00:00 | [] | [] | TAGS
#license-mit #region-us
|
Revisions to 'alphalm/gt1_8kElo_all_tokenized':
- model_max_length: 8192 v0 -> 4096 v1
- Only add eos_token to checkmate games | [] | [
"TAGS\n#license-mit #region-us \n"
] |
70e42bb972cb636494dd9ec1075287cbb8b147fd |
# Dataset of hatakaze (Kantai Collection)
This is the dataset of hatakaze (Kantai Collection), containing 324 images and their tags.
The core tags of this character are `drill_hair, ponytail, blue_eyes, light_brown_hair, hair_between_eyes, ribbon, brown_hair, hair_ribbon, hair_ornament, red_ribbon, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 324 | 335.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatakaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 324 | 219.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatakaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 746 | 453.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatakaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 324 | 308.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatakaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 746 | 599.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatakaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
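Any of the packages above can also be fetched directly with `huggingface_hub`; below is a minimal sketch, assuming the 800-pixel IMG+TXT archive and an extraction directory of your choice. The raw package additionally has a dedicated waifuc loader, shown in the next section.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# pick any filename from the table above, e.g. the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/hatakaze_kantaicollection',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract to a local directory (the name here is arbitrary)
extract_dir = 'hatakaze_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)
```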
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hatakaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, haori, meiji_schoolgirl_uniform, simple_background, solo, upper_body, white_background, white_kimono, looking_at_viewer, smile, blush, furisode |
| 1 | 7 |  |  |  |  |  | 1girl, black_hakama, haori, meiji_schoolgirl_uniform, simple_background, solo, white_background, white_kimono, hakama_short_skirt, looking_at_viewer, smile, blush, furisode |
| 2 | 10 |  |  |  |  |  | detached_collar, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, 1girl, cleavage, medium_breasts, simple_background, solo, strapless_leotard, wrist_cuffs, bowtie, alternate_costume, long_hair, white_background, black_pantyhose, blush, cowboy_shot, rabbit_tail, full_body, high_heels, thighhighs, yellow_leotard |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | haori | meiji_schoolgirl_uniform | simple_background | solo | upper_body | white_background | white_kimono | looking_at_viewer | smile | blush | furisode | black_hakama | hakama_short_skirt | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | cleavage | medium_breasts | strapless_leotard | wrist_cuffs | bowtie | alternate_costume | long_hair | black_pantyhose | cowboy_shot | rabbit_tail | full_body | high_heels | thighhighs | yellow_leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------------------------|:--------------------|:-------|:-------------|:-------------------|:---------------|:--------------------|:--------|:--------|:-----------|:---------------|:---------------------|:------------------|:-------------------|:----------------|:--------------|:-----------|:-----------------|:--------------------|:--------------|:---------|:--------------------|:------------|:------------------|:--------------|:--------------|:------------|:-------------|:-------------|:-----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | | | X | X | | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hatakaze_kantaicollection | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-16T06:34:55+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-16T07:34:37+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of hatakaze (Kantai Collection)
=======================================
This is the dataset of hatakaze (Kantai Collection), containing 324 images and their tags.
The core tags of this character are 'drill\_hair, ponytail, blue\_eyes, light\_brown\_hair, hair\_between\_eyes, ribbon, brown\_hair, hair\_ribbon, hair\_ornament, red\_ribbon, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
d34c0b3c82cc25267b29dd521480cc35ba92871b |
# Apache-2.0 + WhiteRabbitNeo Extended Version
# Licence: Usage Restrictions
```
You agree not to use the Model or Derivatives of the Model:
- In any way that violates any applicable national or international law or regulation or infringes upon the lawful rights and interests of any third party;
- For military use in any way;
- For the purpose of exploiting, harming or attempting to exploit or harm minors in any way;
- To generate or disseminate verifiably false information and/or content with the purpose of harming others;
- To generate or disseminate inappropriate content subject to applicable regulatory requirements;
- To generate or disseminate personal identifiable information without due authorization or for unreasonable use;
- To defame, disparage or otherwise harass others;
- For fully automated decision making that adversely impacts an individual’s legal rights or otherwise creates or modifies a binding, enforceable obligation;
- For any use intended to or which has the effect of discriminating against or harming individuals or groups based on online or offline social behavior or known or predicted personal or personality characteristics;
- To exploit any of the vulnerabilities of a specific group of persons based on their age, social, physical or mental characteristics, in order to materially distort the behavior of a person pertaining to that group in a manner that causes or is likely to cause that person or another person physical or psychological harm;
- For any use intended to or which has the effect of discriminating against individuals or groups based on legally protected characteristics or categories.
``` | WhiteRabbitNeo/WRN-Chapter-2 | [
"license:other",
"region:us"
] | 2024-01-16T06:37:17+00:00 | {"license": "other"} | 2024-01-16T16:05:33+00:00 | [] | [] | TAGS
#license-other #region-us
|
# Apache-2.0 + WhiteRabbitNeo Extended Version
# Licence: Usage Restrictions
| [
"# Apache-2.0 + WhiteRabbitNeo Extended Version",
"# Licence: Usage Restrictions"
] | [
"TAGS\n#license-other #region-us \n",
"# Apache-2.0 + WhiteRabbitNeo Extended Version",
"# Licence: Usage Restrictions"
] |
2ff93dcff559173e7a39c6fe6478c2967213b78b |
# Dataset of uranami (Kantai Collection)
This is the dataset of uranami (Kantai Collection), containing 214 images and their tags.
The core tags of this character are `long_hair, brown_hair, braid, brown_eyes, single_braid, hair_over_shoulder`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 214 | 135.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uranami_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 214 | 99.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uranami_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 386 | 178.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uranami_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 214 | 126.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uranami_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 386 | 220.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uranami_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
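
If you only need one of the pre-processed IMG+TXT packages rather than the raw data, it can be downloaded and extracted in the same way as the raw archive below. The sketch here fetches the 800px package; the archive filename is taken from the table above.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/uranami_kantaicollection',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the package contents (images plus tag text files, per the IMG+TXT type above)
dataset_dir = 'uranami_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```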
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/uranami_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, blue_sailor_collar, blue_skirt, looking_at_viewer, pleated_skirt, serafuku, simple_background, solo, white_background, neckerchief, cowboy_shot, one-hour_drawing_challenge, open_mouth, dated, twitter_username, smile |
| 1 | 8 |  |  |  |  |  | blue_pants, camera, white_shirt, 1girl, belt, bag, jeans, open_mouth, solo, alternate_costume, cowboy_shot, looking_at_viewer, short_sleeves, smile, blush, holding, simple_background, white_background |
| 2 | 7 |  |  |  |  |  | 1girl, blush, solo, underwear_only, bangs, looking_at_viewer, white_panties, navel, open_mouth, small_breasts, white_bra, collarbone |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_sailor_collar | blue_skirt | looking_at_viewer | pleated_skirt | serafuku | simple_background | solo | white_background | neckerchief | cowboy_shot | one-hour_drawing_challenge | open_mouth | dated | twitter_username | smile | blue_pants | camera | white_shirt | belt | bag | jeans | alternate_costume | short_sleeves | blush | holding | underwear_only | bangs | white_panties | navel | small_breasts | white_bra | collarbone |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:-------------|:--------------------|:----------------|:-----------|:--------------------|:-------|:-------------------|:--------------|:--------------|:-----------------------------|:-------------|:--------|:-------------------|:--------|:-------------|:---------|:--------------|:-------|:------|:--------|:--------------------|:----------------|:--------|:----------|:-----------------|:--------|:----------------|:--------|:----------------|:------------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | | X | | | X | X | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | X | | | | X | | | | | X | | | | | | | | | | | | X | | X | X | X | X | X | X | X |
| CyberHarem/uranami_kantaicollection | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-16T07:07:31+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-16T07:43:54+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of uranami (Kantai Collection)
======================================
This is the dataset of uranami (Kantai Collection), containing 214 images and their tags.
The core tags of this character are 'long\_hair, brown\_hair, braid, brown\_eyes, single\_braid, hair\_over\_shoulder', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
61a3ce5f722add2262b63547c94f48b4b7ef9b5d |
Please refer to our [GitHub repo](https://github.com/GAIR-NLP/ReAlign) for more details. | GAIR/ReAlign-Open-Platypus | [
"task_categories:question-answering",
"task_categories:conversational",
"size_categories:10K<n<100K",
"language:en",
"region:us"
] | 2024-01-16T07:23:18+00:00 | {"language": ["en"], "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "conversational"]} | 2024-01-16T07:25:52+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-conversational #size_categories-10K<n<100K #language-English #region-us
|
Please refer to our GitHub repo for more details. | [] | [
"TAGS\n#task_categories-question-answering #task_categories-conversational #size_categories-10K<n<100K #language-English #region-us \n"
] |
8d00ec38ff7546a66248168b5fcdf545cc9038d4 |
Please refer to our [GitHub repo](https://github.com/GAIR-NLP/ReAlign) for more details. | GAIR/ReAlign-Alpaca | [
"task_categories:question-answering",
"task_categories:conversational",
"size_categories:10K<n<100K",
"language:en",
"region:us"
] | 2024-01-16T07:27:05+00:00 | {"language": ["en"], "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "conversational"]} | 2024-01-16T07:28:10+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-conversational #size_categories-10K<n<100K #language-English #region-us
|
Please refer to our GitHub repo for more details. | [] | [
"TAGS\n#task_categories-question-answering #task_categories-conversational #size_categories-10K<n<100K #language-English #region-us \n"
] |
367a0146f3a065675172813524e36e7b3e15a163 |
Please refer to our [GitHub repo](https://github.com/GAIR-NLP/ReAlign) for more details. | GAIR/ReAlign-No-Robots | [
"task_categories:question-answering",
"task_categories:conversational",
"size_categories:1K<n<10K",
"language:en",
"region:us"
] | 2024-01-16T07:28:46+00:00 | {"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["question-answering", "conversational"]} | 2024-01-16T07:29:29+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-conversational #size_categories-1K<n<10K #language-English #region-us
|
Please refer to our GitHub repo for more details. | [] | [
"TAGS\n#task_categories-question-answering #task_categories-conversational #size_categories-1K<n<10K #language-English #region-us \n"
] |
57d3da0474d4d2a4edc41639869373171660ac6c | # Baidu ULTR Dataset - UvA BERT-12l-12h
Query-document vectors and clicks for a subset of the [Baidu Unbiased Learning to Rank
dataset](https://arxiv.org/abs/2207.03051).
This dataset uses a BERT cross-encoder with 12 layers trained on a Masked Language Modeling (MLM) and click-through-rate (CTR) prediction task to compute query-document vectors (768 dims).
The model is available under `model/`.
## Setup
1. Install huggingface [datasets](https://huggingface.co/docs/datasets/installation)
2. Install [pandas](https://github.com/pandas-dev/pandas) and [pyarrow](https://arrow.apache.org/docs/python/index.html): `pip install pandas pyarrow`
3. Optionally, you might need to install a [pyarrow-hotfix](https://github.com/pitrou/pyarrow-hotfix) if you cannot install `pyarrow >= 14.0.1`
4. You can now use the dataset as described below.
## Load train / test click dataset:
```Python
from datasets import load_dataset
dataset = load_dataset(
"philipphager/baidu-ultr_uva-mlm-ctr",
name="clicks",
split="train", # ["train", "test"]
cache_dir="~/.cache/huggingface",
)
dataset.set_format("torch") # [None, "numpy", "torch", "tensorflow", "pandas", "arrow"]
```
## Load expert annotations:
```Python
from datasets import load_dataset
dataset = load_dataset(
"philipphager/baidu-ultr_uva-mlm-ctr",
name="annotations",
split="test",
cache_dir="~/.cache/huggingface",
)
dataset.set_format("torch") # [None, "numpy", "torch", "tensorflow", "pandas", "arrow"]
```
## Available features
Each row of the click / annotation dataset contains the following attributes. Use a custom `collate_fn` to select specific features (see below):
### Click dataset
| name | dtype | description |
|------------------------------|----------------|-------------|
| query_id | string | Baidu query_id |
| query_md5 | string | MD5 hash of query text |
| url_md5 | List[string] | MD5 hash of document url, most reliable document identifier |
| text_md5 | List[string] | MD5 hash of document title and abstract |
| query_document_embedding | Tensor[float16]| BERT CLS token |
| click | Tensor[int32] | Click / no click on a document |
| n | int32 | Number of documents for current query, useful for padding |
| position | Tensor[int32] | Position in ranking (does not always match original item position) |
| media_type                    | Tensor[int32]  | Document type (label encoding recommended as ids do not occupy a continuous integer range) |
| displayed_time | Tensor[float32]| Seconds a document was displayed on screen |
| serp_height | Tensor[int32] | Pixel height of a document on screen |
| slipoff_count_after_click | Tensor[int32] | Number of times a document was scrolled off screen after previously clicking on it |
### Expert annotation dataset
| name | dtype | description |
|------------------------------|----------------|-------------|
| query_id | string | Baidu query_id |
| query_md5 | string | MD5 hash of query text |
| text_md5 | List[string] | MD5 hash of document title and abstract |
| query_document_embedding | Tensor[float16]| BERT CLS token |
| label | Tensor[int32] | Relevance judgment on a scale from 0 (bad) to 4 (excellent) |
| n | int32 | Number of documents for current query, useful for padding |
| frequency_bucket | int32 | Monthly frequency of query (bucket) from 0 (high frequency) to 9 (low frequency) |
## Example PyTorch collate function
Each sample in the dataset is a single query with multiple documents.
The following example demonstrates how to create a batch containing multiple queries with varying numbers of documents by applying padding:
```Python
import torch
from typing import List
from collections import defaultdict
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader
def collate_clicks(samples: List):
batch = defaultdict(lambda: [])
for sample in samples:
batch["query_document_embedding"].append(sample["query_document_embedding"])
batch["position"].append(sample["position"])
batch["click"].append(sample["click"])
batch["n"].append(sample["n"])
return {
"query_document_embedding": pad_sequence(batch["query_document_embedding"], batch_first=True),
"position": pad_sequence(batch["position"], batch_first=True),
"click": pad_sequence(batch["click"], batch_first=True),
"n": torch.tensor(batch["n"]),
}
loader = DataLoader(dataset, collate_fn=collate_clicks, batch_size=16)
```
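
A similar function can be used for the expert annotation split. The sketch below is a hedged adaptation of the click collate above; the variable name `annotation_dataset` refers to the dataset loaded in the "Load expert annotations" section, and the field names follow the annotation feature table:

```python
import torch
from typing import List
from collections import defaultdict
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader


def collate_annotations(samples: List):
    batch = defaultdict(lambda: [])

    for sample in samples:
        batch["query_document_embedding"].append(sample["query_document_embedding"])
        batch["label"].append(sample["label"])
        batch["frequency_bucket"].append(sample["frequency_bucket"])
        batch["n"].append(sample["n"])

    return {
        "query_document_embedding": pad_sequence(batch["query_document_embedding"], batch_first=True),
        "label": pad_sequence(batch["label"], batch_first=True),
        "frequency_bucket": torch.tensor(batch["frequency_bucket"]),
        "n": torch.tensor(batch["n"]),
    }


# `annotation_dataset` is the dataset from the "Load expert annotations" section above
loader = DataLoader(annotation_dataset, collate_fn=collate_annotations, batch_size=16)

for batch in loader:
    # query_document_embedding has shape (batch_size, max_docs_per_query, 768)
    print(batch["query_document_embedding"].shape, batch["label"].shape)
    break
```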
| philipphager/baidu-ultr_uva-mlm-ctr | [
"license:cc-by-nc-4.0",
"arxiv:2207.03051",
"region:us"
] | 2024-01-16T07:51:30+00:00 | {"license": "cc-by-nc-4.0", "viewer": false} | 2024-02-08T06:15:48+00:00 | [
"2207.03051"
] | [] | TAGS
#license-cc-by-nc-4.0 #arxiv-2207.03051 #region-us
| Baidu ULTR Dataset - UvA BERT-12l-12h
=====================================
Query-document vectors and clicks for a subset of the Baidu Unbiased Learning to Rank
dataset.
This dataset uses a BERT cross-encoder with 12 layers trained on a Masked Language Modeling (MLM) and click-through-rate (CTR) prediction task to compute query-document vectors (768 dims).
The model is available under 'model/'.
Setup
-----
1. Install huggingface datasets
2. Install pandas and pyarrow: 'pip install pandas pyarrow'
3. Optionally, you might need to install a pyarrow-hotfix if you cannot install 'pyarrow >= 14.0.1'
4. You can now use the dataset as described below.
Load train / test click dataset:
--------------------------------
Load expert annotations:
------------------------
Available features
------------------
Each row of the click / annotation dataset contains the following attributes. Use a custom 'collate\_fn' to select specific features (see below):
### Click dataset
name: query\_id, dtype: string, description: Baidu query\_id
name: query\_md5, dtype: string, description: MD5 hash of query text
name: url\_md5, dtype: List[string], description: MD5 hash of document url, most reliable document identifier
name: text\_md5, dtype: List[string], description: MD5 hash of document title and abstract
name: query\_document\_embedding, dtype: Tensor[float16], description: BERT CLS token
name: click, dtype: Tensor[int32], description: Click / no click on a document
name: n, dtype: int32, description: Number of documents for current query, useful for padding
name: position, dtype: Tensor[int32], description: Position in ranking (does not always match original item position)
name: media\_type, dtype: Tensor[int32], description: Document type (label encoding recommended as ids do not occupy a continuous integer range)
name: displayed\_time, dtype: Tensor[float32], description: Seconds a document was displayed on screen
name: serp\_height, dtype: Tensor[int32], description: Pixel height of a document on screen
name: slipoff\_count\_after\_click, dtype: Tensor[int32], description: Number of times a document was scrolled off screen after previously clicking on it
### Expert annotation dataset
name: query\_id, dtype: string, description: Baidu query\_id
name: query\_md5, dtype: string, description: MD5 hash of query text
name: text\_md5, dtype: List[string], description: MD5 hash of document title and abstract
name: query\_document\_embedding, dtype: Tensor[float16], description: BERT CLS token
name: label, dtype: Tensor[int32], description: Relevance judgment on a scale from 0 (bad) to 4 (excellent)
name: n, dtype: int32, description: Number of documents for current query, useful for padding
name: frequency\_bucket, dtype: int32, description: Monthly frequency of query (bucket) from 0 (high frequency) to 9 (low frequency)
Example PyTorch collate function
--------------------------------
Each sample in the dataset is a single query with multiple documents.
The following example demonstrates how to create a batch containing multiple queries with varying numbers of documents by applying padding:
| [
"### Click dataset\n\n\nname: query\\_id, dtype: string, description: Baidu query\\_id\nname: query\\_md5, dtype: string, description: MD5 hash of query text\nname: url\\_md5, dtype: List[string], description: MD5 hash of document url, most reliable document identifier\nname: text\\_md5, dtype: List[string], description: MD5 hash of document title and abstract\nname: query\\_document\\_embedding, dtype: Tensor[float16], description: BERT CLS token\nname: click, dtype: Tensor[int32], description: Click / no click on a document\nname: n, dtype: int32, description: Number of documents for current query, useful for padding\nname: position, dtype: Tensor[int32], description: Position in ranking (does not always match original item position)\nname: media\\_type, dtype: Tensor[int32], description: Document type (label encoding recommended as ids do not occupy a continous integer range)\nname: displayed\\_time, dtype: Tensor[float32], description: Seconds a document was displayed on screen\nname: serp\\_height, dtype: Tensor[int32], description: Pixel height of a document on screen\nname: slipoff\\_count\\_after\\_click, dtype: Tensor[int32], description: Number of times a document was scrolled off screen after previously clicking on it",
"### Expert annotation dataset\n\n\nname: query\\_id, dtype: string, description: Baidu query\\_id\nname: query\\_md5, dtype: string, description: MD5 hash of query text\nname: text\\_md5, dtype: List[string], description: MD5 hash of document title and abstract\nname: query\\_document\\_embedding, dtype: Tensor[float16], description: BERT CLS token\nname: label, dtype: Tensor[int32], description: Relevance judgment on a scale from 0 (bad) to 4 (excellent)\nname: n, dtype: int32, description: Number of documents for current query, useful for padding\nname: frequency\\_bucket, dtype: int32, description: Monthly frequency of query (bucket) from 0 (high frequency) to 9 (low frequency)\n\n\nExample PyTorch collate function\n--------------------------------\n\n\nEach sample in the dataset is a single query with multiple documents.\nThe following example demonstrates how to create a batch containing multiple queries with varying numbers of documents by applying padding:"
] | [
"TAGS\n#license-cc-by-nc-4.0 #arxiv-2207.03051 #region-us \n",
"### Click dataset\n\n\nname: query\\_id, dtype: string, description: Baidu query\\_id\nname: query\\_md5, dtype: string, description: MD5 hash of query text\nname: url\\_md5, dtype: List[string], description: MD5 hash of document url, most reliable document identifier\nname: text\\_md5, dtype: List[string], description: MD5 hash of document title and abstract\nname: query\\_document\\_embedding, dtype: Tensor[float16], description: BERT CLS token\nname: click, dtype: Tensor[int32], description: Click / no click on a document\nname: n, dtype: int32, description: Number of documents for current query, useful for padding\nname: position, dtype: Tensor[int32], description: Position in ranking (does not always match original item position)\nname: media\\_type, dtype: Tensor[int32], description: Document type (label encoding recommended as ids do not occupy a continous integer range)\nname: displayed\\_time, dtype: Tensor[float32], description: Seconds a document was displayed on screen\nname: serp\\_height, dtype: Tensor[int32], description: Pixel height of a document on screen\nname: slipoff\\_count\\_after\\_click, dtype: Tensor[int32], description: Number of times a document was scrolled off screen after previously clicking on it",
"### Expert annotation dataset\n\n\nname: query\\_id, dtype: string, description: Baidu query\\_id\nname: query\\_md5, dtype: string, description: MD5 hash of query text\nname: text\\_md5, dtype: List[string], description: MD5 hash of document title and abstract\nname: query\\_document\\_embedding, dtype: Tensor[float16], description: BERT CLS token\nname: label, dtype: Tensor[int32], description: Relevance judgment on a scale from 0 (bad) to 4 (excellent)\nname: n, dtype: int32, description: Number of documents for current query, useful for padding\nname: frequency\\_bucket, dtype: int32, description: Monthly frequency of query (bucket) from 0 (high frequency) to 9 (low frequency)\n\n\nExample PyTorch collate function\n--------------------------------\n\n\nEach sample in the dataset is a single query with multiple documents.\nThe following example demonstrates how to create a batch containing multiple queries with varying numbers of documents by applying padding:"
] |
dc578befd7cf36f61e8fbd2fe14a144d5e112581 | This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information.
https://huggingface.co/datasets/biglab/webui-7k
```
from datasets import load_dataset
dataset = load_dataset("biglab/webui-7k-elements")
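
# The split names and element-level fields aren't documented here, so printing the
# DatasetDict is a quick way to see the available splits, features, and row counts:
print(dataset)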
``` | biglab/webui-7k-elements | [
"region:us"
] | 2024-01-16T07:55:40+00:00 | {} | 2024-01-23T02:34:48+00:00 | [] | [] | TAGS
#region-us
| This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information.
URL
| [] | [
"TAGS\n#region-us \n"
] |
e037ed9e49cb7f0ecb253bdbaded0b8cde385c1a |
# Dataset Card for Evaluation run of Technoculture/Medtulu-2x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/Medtulu-2x7b](https://huggingface.co/Technoculture/Medtulu-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__Medtulu-2x7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T08:08:44.091130](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medtulu-2x7b/blob/main/results_2024-01-16T08-08-44.091130.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4912286252834545,
"acc_stderr": 0.03450140674623141,
"acc_norm": 0.4966099863528162,
"acc_norm_stderr": 0.035271481019980566,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276775,
"mc2": 0.500358139155482,
"mc2_stderr": 0.015732799808200134
},
"harness|arc:challenge|25": {
"acc": 0.5034129692832765,
"acc_stderr": 0.014611050403244077,
"acc_norm": 0.5460750853242321,
"acc_norm_stderr": 0.014549221105171869
},
"harness|hellaswag|10": {
"acc": 0.566122286397132,
"acc_stderr": 0.004945956744943815,
"acc_norm": 0.7568213503286197,
"acc_norm_stderr": 0.004281253317507337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.038118909889404105,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.038118909889404105
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992083,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992083
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6414141414141414,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.6414141414141414,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.03292296639155142,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.03292296639155142
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833713,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833713
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6770642201834862,
"acc_stderr": 0.020048115923415315,
"acc_norm": 0.6770642201834862,
"acc_norm_stderr": 0.020048115923415315
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936464,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936464
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.03393388584958406,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.03393388584958406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598028,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5336322869955157,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.5336322869955157,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.04356447202665069,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.04356447202665069
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.028605953702004257,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.028605953702004257
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6475095785440613,
"acc_stderr": 0.01708415024408138,
"acc_norm": 0.6475095785440613,
"acc_norm_stderr": 0.01708415024408138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.02683080599895224,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.02683080599895224
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409155,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409155
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5,
"acc_stderr": 0.028629916715693413,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028629916715693413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325953,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.027744313443376536,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.027744313443376536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38070404172099087,
"acc_stderr": 0.012401430654645898,
"acc_norm": 0.38070404172099087,
"acc_norm_stderr": 0.012401430654645898
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213514,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213514
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43790849673202614,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.43790849673202614,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.03345563070339191,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.03345563070339191
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.03819486140758398,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.03819486140758398
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276775,
"mc2": 0.500358139155482,
"mc2_stderr": 0.015732799808200134
},
"harness|winogrande|5": {
"acc": 0.728492501973165,
"acc_stderr": 0.012499326254893129
},
"harness|gsm8k|5": {
"acc": 0.16982562547384383,
"acc_stderr": 0.0103425723608612
}
}
```
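
As a rough illustration of how to work with a results dictionary of this shape, the self-contained sketch below copies only two of the `hendrycksTest` entries from the block above into a small `results` dict (the real results files in this repo may nest the scores under additional keys) and averages `acc_norm` over the MMLU-style subtasks:

```python
# `results` mirrors the structure of the "Latest results" block above;
# only two subtasks are copied here to keep the example short.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.45925925925925926},
    "harness|truthfulqa:mc|0": {"mc1": 0.34394124847001223},
}

mmlu_scores = [
    task["acc_norm"]
    for name, task in results.items()
    if name.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_scores)} MMLU subtasks, mean acc_norm = {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```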
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Technoculture__Medtulu-2x7b | [
"region:us"
] | 2024-01-16T08:11:01+00:00 | {"pretty_name": "Evaluation run of Technoculture/Medtulu-2x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Technoculture/Medtulu-2x7b](https://huggingface.co/Technoculture/Medtulu-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__Medtulu-2x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T08:08:44.091130](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medtulu-2x7b/blob/main/results_2024-01-16T08-08-44.091130.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4912286252834545,\n \"acc_stderr\": 0.03450140674623141,\n \"acc_norm\": 0.4966099863528162,\n \"acc_norm_stderr\": 0.035271481019980566,\n \"mc1\": 0.34394124847001223,\n \"mc1_stderr\": 0.016629087514276775,\n \"mc2\": 0.500358139155482,\n \"mc2_stderr\": 0.015732799808200134\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5034129692832765,\n \"acc_stderr\": 0.014611050403244077,\n \"acc_norm\": 0.5460750853242321,\n \"acc_norm_stderr\": 0.014549221105171869\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.566122286397132,\n \"acc_stderr\": 0.004945956744943815,\n \"acc_norm\": 0.7568213503286197,\n \"acc_norm_stderr\": 0.004281253317507337\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.0404633688397825,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.0404633688397825\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 
0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.038118909889404105,\n \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.038118909889404105\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992083,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992083\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6414141414141414,\n \"acc_stderr\": 0.03416903640391521,\n \"acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.03416903640391521\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.03292296639155142,\n \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.03292296639155142\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833713,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833713\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6770642201834862,\n \"acc_stderr\": 0.020048115923415315,\n \"acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.020048115923415315\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936464,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936464\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.03393388584958406,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.03393388584958406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598028,\n \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598028\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5336322869955157,\n \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.5336322869955157,\n \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n \"acc_stderr\": 0.028605953702004257,\n \"acc_norm\": 0.7435897435897436,\n \"acc_norm_stderr\": 0.028605953702004257\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.6475095785440613,\n \"acc_stderr\": 0.01708415024408138,\n \"acc_norm\": 0.6475095785440613,\n \"acc_norm_stderr\": 0.01708415024408138\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.02683080599895224,\n \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.02683080599895224\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n \"acc_stderr\": 0.014572650383409155,\n \"acc_norm\": 0.2547486033519553,\n \"acc_norm_stderr\": 0.014572650383409155\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028629916715693413,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028629916715693413\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n \"acc_stderr\": 0.027882383791325953,\n \"acc_norm\": 0.594855305466238,\n \"acc_norm_stderr\": 0.027882383791325953\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.027744313443376536,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.027744313443376536\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38070404172099087,\n \"acc_stderr\": 0.012401430654645898,\n \"acc_norm\": 0.38070404172099087,\n \"acc_norm_stderr\": 0.012401430654645898\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.43790849673202614,\n \"acc_stderr\": 0.020071257886886525,\n \"acc_norm\": 0.43790849673202614,\n \"acc_norm_stderr\": 0.020071257886886525\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n \"acc_stderr\": 0.03345563070339191,\n \"acc_norm\": 0.6616915422885572,\n \"acc_norm_stderr\": 0.03345563070339191\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.4036144578313253,\n \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n \"mc1_stderr\": 0.016629087514276775,\n \"mc2\": 0.500358139155482,\n \"mc2_stderr\": 0.015732799808200134\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893129\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16982562547384383,\n \"acc_stderr\": 0.0103425723608612\n }\n}\n```", 
"repo_url": "https://huggingface.co/Technoculture/Medtulu-2x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|arc:challenge|25_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|gsm8k|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hellaswag|10_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-08-44.091130.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-08-44.091130.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-08-44.091130.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T08-08-44.091130.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-08-44.091130.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T08_08_44.091130", "path": ["**/details_harness|winogrande|5_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T08-08-44.091130.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_16T08_08_44.091130", "path": ["results_2024-01-16T08-08-44.091130.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T08-08-44.091130.parquet"]}]}]} | 2024-01-16T08:11:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Technoculture/Medtulu-2x7b
Dataset automatically created during the evaluation run of model Technoculture/Medtulu-2x7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
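A minimal example, following the same loading convention shown by the other evaluation-detail cards in this collection (the repository name and configuration below are assumed from that pattern):

```python
from datasets import load_dataset

# "harness_winogrande_5" selects the per-sample details for one task (Winogrande, 5-shot);
# the repository name follows the leaderboard's details_<org>__<model> convention.
data = load_dataset("open-llm-leaderboard/details_Technoculture__Medtulu-2x7b",
	"harness_winogrande_5",
	split="train")
```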
## Latest results
These are the latest results from run 2024-01-16T08:08:44.091130 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Technoculture/Medtulu-2x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/Medtulu-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T08:08:44.091130(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Technoculture/Medtulu-2x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/Medtulu-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T08:08:44.091130(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d782e4079f7649a2a209d993f626d6baa482b508 |
This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information.
https://huggingface.co/datasets/biglab/webui-7kbal
```
from datasets import load_dataset
dataset = load_dataset("biglab/webui-7kbal-elements")
```
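To inspect a sample's element annotations, the fields can be accessed directly. The sketch below is based on the feature listing of this repacked dataset (field names such as `labels` and `contentBoxes`); the exact bounding-box convention follows the original WebUI dataset.

```python
sample = dataset["train"][0]

image = sample["image"]          # screenshot of the rendered page (PIL image)
labels = sample["labels"]        # per-element label lists
boxes = sample["contentBoxes"]   # per-element content-box coordinates

print(f"{len(labels)} annotated elements; first label group: {labels[0]}")
```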
| biglab/webui-7kbal-elements | [
"region:us"
] | 2024-01-16T08:13:40+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "labels", "sequence": {"sequence": "string"}}, {"name": "contentBoxes", "sequence": {"sequence": "float64"}}, {"name": "paddingBoxes", "sequence": {"sequence": "float64"}}, {"name": "borderBoxes", "sequence": {"sequence": "float64"}}, {"name": "marginBoxes", "sequence": {"sequence": "float64"}}, {"name": "key_name", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1865221115.665, "num_examples": 38411}], "download_size": 1501188240, "dataset_size": 1865221115.665}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-23T02:35:27+00:00 | [] | [] | TAGS
#region-us
|
This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information.
URL
| [] | [
"TAGS\n#region-us \n"
] |
599b505a130d188ded3dab586ad31ec6be757613 | # Dataset Card for "repo-codegen-py-py-context-path-distance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jenyag/repo-codegen-py-py-context-path-distance | [
"region:us"
] | 2024-01-16T08:18:07+00:00 | {"dataset_info": {"features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 114370147, "num_examples": 224}], "download_size": 22014753, "dataset_size": 114370147}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}]} | 2024-01-16T09:07:58+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "repo-codegen-py-py-context-path-distance"
More Information needed | [
"# Dataset Card for \"repo-codegen-py-py-context-path-distance\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"repo-codegen-py-py-context-path-distance\"\n\nMore Information needed"
] |
a2d16bc78788ebd64693594d32532fcf97273e16 | # Dataset Card for "repo-codegen-py-non-py-context-path-distance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jenyag/repo-codegen-py-non-py-context-path-distance | [
"region:us"
] | 2024-01-16T08:18:17+00:00 | {"dataset_info": {"features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 560157388, "num_examples": 224}], "download_size": 226460548, "dataset_size": 560157388}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}]} | 2024-01-16T09:08:25+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "repo-codegen-py-non-py-context-path-distance"
More Information needed | [
"# Dataset Card for \"repo-codegen-py-non-py-context-path-distance\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"repo-codegen-py-non-py-context-path-distance\"\n\nMore Information needed"
] |
02fd3ae98c1bbda388ff8f133deb83a0b12f6af3 | # Dataset Card for "repo-codegen-py-all-context-path-distance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jenyag/repo-codegen-py-all-context-path-distance | [
"region:us"
] | 2024-01-16T08:18:38+00:00 | {"dataset_info": {"features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 590554966, "num_examples": 224}], "download_size": 236585246, "dataset_size": 590554966}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}]} | 2024-01-16T09:08:49+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "repo-codegen-py-all-context-path-distance"
More Information needed | [
"# Dataset Card for \"repo-codegen-py-all-context-path-distance\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"repo-codegen-py-all-context-path-distance\"\n\nMore Information needed"
] |
35344e11e690f29a6fcd7caf09b7d5b840a163ae |
# Dataset Card for Evaluation run of Technoculture/Medorca-2x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/Medorca-2x7b](https://huggingface.co/Technoculture/Medorca-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__Medorca-2x7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T08:17:44.723678](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medorca-2x7b/blob/main/results_2024-01-16T08-17-44.723678.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5314920128709506,
"acc_stderr": 0.03395701268323282,
"acc_norm": 0.5369605553507105,
"acc_norm_stderr": 0.03469962839191263,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.48040816029933614,
"mc2_stderr": 0.016304788824564324
},
"harness|arc:challenge|25": {
"acc": 0.5059726962457338,
"acc_stderr": 0.014610348300255795,
"acc_norm": 0.5409556313993175,
"acc_norm_stderr": 0.01456229107360123
},
"harness|hellaswag|10": {
"acc": 0.5844453296156145,
"acc_stderr": 0.004918102168717934,
"acc_norm": 0.7604062935670185,
"acc_norm_stderr": 0.004259631900173254
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.040633027314866725,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.040633027314866725
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.02977308271331987,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.02977308271331987
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325583,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325583
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.02413015829976262,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.02413015829976262
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.0416345303130286,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.0416345303130286
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6064516129032258,
"acc_stderr": 0.02779187875313227,
"acc_norm": 0.6064516129032258,
"acc_norm_stderr": 0.02779187875313227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998575,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164552,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182087,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182087
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501624,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.03256685484460388,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.03256685484460388
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.04587904741301812,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.04587904741301812
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.03856672163548914,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.03856672163548914
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7254150702426565,
"acc_stderr": 0.015959829933084032,
"acc_norm": 0.7254150702426565,
"acc_norm_stderr": 0.015959829933084032
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.026296227915613674,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.026296227915613674
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.311731843575419,
"acc_stderr": 0.015491756531894637,
"acc_norm": 0.311731843575419,
"acc_norm_stderr": 0.015491756531894637
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.028431095444176643,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.028431095444176643
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5802469135802469,
"acc_stderr": 0.02746009955700513,
"acc_norm": 0.5802469135802469,
"acc_norm_stderr": 0.02746009955700513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963764,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963764
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40091264667535853,
"acc_stderr": 0.012516960350640823,
"acc_norm": 0.40091264667535853,
"acc_norm_stderr": 0.012516960350640823
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.030290619180485687,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.030290619180485687
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.020227402794434867,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.020227402794434867
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.030555316755573644,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.030555316755573644
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.034010526201040885,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.034010526201040885
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.48040816029933614,
"mc2_stderr": 0.016304788824564324
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.01224880696937642
},
"harness|gsm8k|5": {
"acc": 0.20621683093252463,
"acc_stderr": 0.01114436408978142
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Technoculture__Medorca-2x7b | [
"region:us"
] | 2024-01-16T08:20:04+00:00 | {"pretty_name": "Evaluation run of Technoculture/Medorca-2x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Technoculture/Medorca-2x7b](https://huggingface.co/Technoculture/Medorca-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__Medorca-2x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T08:17:44.723678](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medorca-2x7b/blob/main/results_2024-01-16T08-17-44.723678.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5314920128709506,\n \"acc_stderr\": 0.03395701268323282,\n \"acc_norm\": 0.5369605553507105,\n \"acc_norm_stderr\": 0.03469962839191263,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.48040816029933614,\n \"mc2_stderr\": 0.016304788824564324\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5059726962457338,\n \"acc_stderr\": 0.014610348300255795,\n \"acc_norm\": 0.5409556313993175,\n \"acc_norm_stderr\": 0.01456229107360123\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5844453296156145,\n \"acc_stderr\": 0.004918102168717934,\n \"acc_norm\": 0.7604062935670185,\n \"acc_norm_stderr\": 0.004259631900173254\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.040633027314866725,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.040633027314866725\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.02977308271331987,\n \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.02977308271331987\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n \"acc_stderr\": 0.04089465449325583,\n \"acc_norm\": 0.6041666666666666,\n \"acc_norm_stderr\": 0.04089465449325583\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 
0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.02413015829976262,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.02413015829976262\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.0416345303130286,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.0416345303130286\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6064516129032258,\n \"acc_stderr\": 0.02779187875313227,\n \"acc_norm\": 0.6064516129032258,\n \"acc_norm_stderr\": 0.02779187875313227\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998575,\n \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164552,\n \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164552\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182087,\n \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182087\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501624,\n \"acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501624\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.03256685484460388,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.03256685484460388\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.04587904741301812,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.04587904741301812\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548914,\n \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548914\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7254150702426565,\n \"acc_stderr\": 
0.015959829933084032,\n \"acc_norm\": 0.7254150702426565,\n \"acc_norm_stderr\": 0.015959829933084032\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.026296227915613674,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.026296227915613674\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.311731843575419,\n \"acc_stderr\": 0.015491756531894637,\n \"acc_norm\": 0.311731843575419,\n \"acc_norm_stderr\": 0.015491756531894637\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176643,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176643\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5802469135802469,\n \"acc_stderr\": 0.02746009955700513,\n \"acc_norm\": 0.5802469135802469,\n \"acc_norm_stderr\": 0.02746009955700513\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40091264667535853,\n \"acc_stderr\": 0.012516960350640823,\n \"acc_norm\": 0.40091264667535853,\n \"acc_norm_stderr\": 0.012516960350640823\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.030290619180485687,\n \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.030290619180485687\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.020227402794434867,\n \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.020227402794434867\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573644,\n \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573644\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.034010526201040885,\n \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.034010526201040885\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.48040816029933614,\n \"mc2_stderr\": 0.016304788824564324\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.01224880696937642\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20621683093252463,\n \"acc_stderr\": 0.01114436408978142\n }\n}\n```", "repo_url": 
"https://huggingface.co/Technoculture/Medorca-2x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|arc:challenge|25_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|gsm8k|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hellaswag|10_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-17-44.723678.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-17-44.723678.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-17-44.723678.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T08-17-44.723678.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-17-44.723678.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-17-44.723678.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["**/details_harness|winogrande|5_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T08-17-44.723678.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T08_17_44.723678", "path": ["results_2024-01-16T08-17-44.723678.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T08-17-44.723678.parquet"]}]}]} | 2024-01-16T08:20:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Technoculture/Medorca-2x7b
Dataset automatically created during the evaluation run of model Technoculture/Medorca-2x7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
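For example (a minimal sketch following the loader pattern used in the other evaluation cards; the repository name, the `harness_winogrande_5` config, and the `latest` split are taken from this card's metadata):

```python
from datasets import load_dataset

# Details for a single task of this run; the "latest" split always points
# to the most recent evaluation results for that task.
data = load_dataset(
    "open-llm-leaderboard/details_Technoculture__Medorca-2x7b",
    "harness_winogrande_5",
    split="latest",
)
```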
## Latest results
These are the latest results from run 2024-01-16T08:17:44.723678 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Technoculture/Medorca-2x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/Medorca-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T08:17:44.723678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Technoculture/Medorca-2x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/Medorca-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T08:17:44.723678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1806b40c6aae812639967b2969d776d9987408ee | # Dataset Card for "myridade_dbg_aligned_ontologie"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/myridade_dbg_aligned_ontologie | [
"region:us"
] | 2024-01-16T08:25:59+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 47868666, "num_examples": 98206}], "download_size": 11334806, "dataset_size": 47868666}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T08:26:02+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "myridade_dbg_aligned_ontologie"
More Information needed | [
"# Dataset Card for \"myridade_dbg_aligned_ontologie\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"myridade_dbg_aligned_ontologie\"\n\nMore Information needed"
] |
f800d33f3bb75c2ab602f88e4dac2a3dd231342f |
# Dataset Card for Evaluation run of Technoculture/Mediquad-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/Mediquad-4x7b](https://huggingface.co/Technoculture/Mediquad-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__Mediquad-4x7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T08:24:28.609699](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Mediquad-4x7b/blob/main/results_2024-01-16T08-24-28.609699.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28390168620155787,
"acc_stderr": 0.031652775276322986,
"acc_norm": 0.2863068103768225,
"acc_norm_stderr": 0.032507978612879115,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.01502635482491078,
"mc2": 0.49560252838701496,
"mc2_stderr": 0.016847234757977527
},
"harness|arc:challenge|25": {
"acc": 0.21075085324232082,
"acc_stderr": 0.011918271754852165,
"acc_norm": 0.27474402730375425,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.26687910774746065,
"acc_stderr": 0.004414246720076111,
"acc_norm": 0.28211511651065524,
"acc_norm_stderr": 0.004491093528113431
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.02634148037111836,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.02634148037111836
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.02977164271249123,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.02977164271249123
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577657,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.15,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114482,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114482
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29797979797979796,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.29797979797979796,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.30569948186528495,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.30569948186528495,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.024078696580635474,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.024078696580635474
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.02865749128507198,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.02865749128507198
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26422018348623855,
"acc_stderr": 0.01890416417151019,
"acc_norm": 0.26422018348623855,
"acc_norm_stderr": 0.01890416417151019
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.03374499356319355,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.03374499356319355
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4388185654008439,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.4388185654008439,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.15246636771300448,
"acc_stderr": 0.024126204813252877,
"acc_norm": 0.15246636771300448,
"acc_norm_stderr": 0.024126204813252877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.39316239316239315,
"acc_stderr": 0.03199957924651047,
"acc_norm": 0.39316239316239315,
"acc_norm_stderr": 0.03199957924651047
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455763,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455763
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409153,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409153
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.02600480036395211,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.02600480036395211
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3665594855305466,
"acc_stderr": 0.02736807824397163,
"acc_norm": 0.3665594855305466,
"acc_norm_stderr": 0.02736807824397163
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.024706141070705477,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.024706141070705477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2861799217731421,
"acc_stderr": 0.011543642878150755,
"acc_norm": 0.2861799217731421,
"acc_norm_stderr": 0.011543642878150755
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2977941176470588,
"acc_stderr": 0.027778298701545443,
"acc_norm": 0.2977941176470588,
"acc_norm_stderr": 0.027778298701545443
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.018120224251484587,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.018120224251484587
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3673469387755102,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.3673469387755102,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.42786069651741293,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.42786069651741293,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.18128654970760233,
"acc_stderr": 0.029547741687640024,
"acc_norm": 0.18128654970760233,
"acc_norm_stderr": 0.029547741687640024
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.01502635482491078,
"mc2": 0.49560252838701496,
"mc2_stderr": 0.016847234757977527
},
"harness|winogrande|5": {
"acc": 0.505130228887135,
"acc_stderr": 0.01405174596179052
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
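If you prefer working with the raw JSON linked above, a minimal sketch (assuming `huggingface_hub` is installed and that the filename shown above is still present in the repository) is:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run from the dataset repository
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Technoculture__Mediquad-4x7b",
    filename="results_2024-01-16T08-24-28.609699.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The per-task metrics may sit at the top level (as shown above) or under a "results" key
metrics = data.get("results", data)

# Example: average normalized accuracy over the MMLU (hendrycksTest) subtasks
mmlu = [
    task["acc_norm"]
    for name, task in metrics.items()
    if name.startswith("harness|hendrycksTest")
]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```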
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Technoculture__Mediquad-4x7b | [
"region:us"
] | 2024-01-16T08:26:46+00:00 | {"pretty_name": "Evaluation run of Technoculture/Mediquad-4x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Technoculture/Mediquad-4x7b](https://huggingface.co/Technoculture/Mediquad-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__Mediquad-4x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T08:24:28.609699](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Mediquad-4x7b/blob/main/results_2024-01-16T08-24-28.609699.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28390168620155787,\n \"acc_stderr\": 0.031652775276322986,\n \"acc_norm\": 0.2863068103768225,\n \"acc_norm_stderr\": 0.032507978612879115,\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.49560252838701496,\n \"mc2_stderr\": 0.016847234757977527\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21075085324232082,\n \"acc_stderr\": 0.011918271754852165,\n \"acc_norm\": 0.27474402730375425,\n \"acc_norm_stderr\": 0.013044617212771227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26687910774746065,\n \"acc_stderr\": 0.004414246720076111,\n \"acc_norm\": 0.28211511651065524,\n \"acc_norm_stderr\": 0.004491093528113431\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.02634148037111836,\n \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.02634148037111836\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n 
\"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.02977164271249123,\n \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.02977164271249123\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577657,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577657\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114482,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114482\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.29797979797979796,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\": 0.29797979797979796,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.30569948186528495,\n \"acc_stderr\": 0.033248379397581594,\n \"acc_norm\": 0.30569948186528495,\n \"acc_norm_stderr\": 0.033248379397581594\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635474,\n \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635474\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.02865749128507198,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.02865749128507198\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26422018348623855,\n \"acc_stderr\": 0.01890416417151019,\n \"acc_norm\": 0.26422018348623855,\n \"acc_norm_stderr\": 0.01890416417151019\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.03374499356319355,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.03374499356319355\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4388185654008439,\n \"acc_stderr\": 0.032302649315470375,\n \"acc_norm\": 0.4388185654008439,\n \"acc_norm_stderr\": 0.032302649315470375\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15246636771300448,\n \"acc_stderr\": 0.024126204813252877,\n \"acc_norm\": 0.15246636771300448,\n \"acc_norm_stderr\": 0.024126204813252877\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.21487603305785125,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.39316239316239315,\n \"acc_stderr\": 0.03199957924651047,\n \"acc_norm\": 0.39316239316239315,\n \"acc_norm_stderr\": 0.03199957924651047\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n 
\"acc_stderr\": 0.015594955384455763,\n \"acc_norm\": 0.2554278416347382,\n \"acc_norm_stderr\": 0.015594955384455763\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.024027745155265012,\n \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.024027745155265012\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n \"acc_stderr\": 0.014572650383409153,\n \"acc_norm\": 0.2547486033519553,\n \"acc_norm_stderr\": 0.014572650383409153\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.02600480036395211,\n \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.02600480036395211\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3665594855305466,\n \"acc_stderr\": 0.02736807824397163,\n \"acc_norm\": 0.3665594855305466,\n \"acc_norm_stderr\": 0.02736807824397163\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042103,\n \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042103\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705477,\n \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705477\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2861799217731421,\n \"acc_stderr\": 0.011543642878150755,\n \"acc_norm\": 0.2861799217731421,\n \"acc_norm_stderr\": 0.011543642878150755\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2977941176470588,\n \"acc_stderr\": 0.027778298701545443,\n \"acc_norm\": 0.2977941176470588,\n \"acc_norm_stderr\": 0.027778298701545443\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.018120224251484587,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.018120224251484587\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3673469387755102,\n \"acc_stderr\": 0.03086214492108756,\n \"acc_norm\": 0.3673469387755102,\n \"acc_norm_stderr\": 0.03086214492108756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.42786069651741293,\n \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.42786069651741293,\n \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.18128654970760233,\n \"acc_stderr\": 0.029547741687640024,\n \"acc_norm\": 0.18128654970760233,\n \"acc_norm_stderr\": 0.029547741687640024\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.49560252838701496,\n \"mc2_stderr\": 0.016847234757977527\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.01405174596179052\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/Technoculture/Mediquad-4x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|arc:challenge|25_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|gsm8k|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hellaswag|10_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-24-28.609699.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-24-28.609699.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-24-28.609699.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T08-24-28.609699.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-24-28.609699.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-24-28.609699.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["**/details_harness|winogrande|5_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T08-24-28.609699.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T08_24_28.609699", "path": ["results_2024-01-16T08-24-28.609699.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T08-24-28.609699.parquet"]}]}]} | 2024-01-16T08:27:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Technoculture/Mediquad-4x7b
Dataset automatically created during the evaluation run of model Technoculture/Mediquad-4x7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
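A minimal sketch (the repository name below is assumed from the leaderboard's usual `details_<org>__<model>` naming convention; adjust it if the actual details repository differs):

```python
from datasets import load_dataset

# Load one evaluation configuration; "harness_winogrande_5" is just an example config name.
data = load_dataset("open-llm-leaderboard/details_Technoculture__Mediquad-4x7b",
	"harness_winogrande_5",
	split="train")
```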
## Latest results
These are the latest results from run 2024-01-16T08:24:28.609699 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Technoculture/Mediquad-4x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/Mediquad-4x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T08:24:28.609699(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Technoculture/Mediquad-4x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/Mediquad-4x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T08:24:28.609699(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
eefbf0da8ae5a6cc28440244f9787646b0f71adb | # Dataset of proteins created by diffusion models | EvaKlimentova/Diffusion-all_knots | [
"region:us"
] | 2024-01-16T08:45:29+00:00 | {"dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "Sequence", "dtype": "string"}, {"name": "Label", "dtype": "int64"}, {"name": "Tool", "dtype": "string"}, {"name": "f0", "dtype": "float64"}, {"name": "f1", "dtype": "float64"}, {"name": "f2", "dtype": "float64"}, {"name": "f3", "dtype": "float64"}, {"name": "f4", "dtype": "float64"}, {"name": "f5", "dtype": "float64"}, {"name": "f6", "dtype": "float64"}, {"name": "f7", "dtype": "float64"}, {"name": "f8", "dtype": "float64"}, {"name": "f9", "dtype": "float64"}, {"name": "f10", "dtype": "float64"}, {"name": "f11", "dtype": "float64"}, {"name": "f12", "dtype": "float64"}, {"name": "f13", "dtype": "float64"}, {"name": "f14", "dtype": "float64"}, {"name": "f15", "dtype": "float64"}, {"name": "f16", "dtype": "float64"}, {"name": "f17", "dtype": "float64"}, {"name": "f18", "dtype": "float64"}, {"name": "f19", "dtype": "float64"}, {"name": "f20", "dtype": "float64"}, {"name": "f21", "dtype": "float64"}, {"name": "f22", "dtype": "float64"}, {"name": "f23", "dtype": "float64"}, {"name": "f24", "dtype": "float64"}, {"name": "f25", "dtype": "float64"}, {"name": "f26", "dtype": "float64"}, {"name": "f27", "dtype": "float64"}, {"name": "f28", "dtype": "float64"}, {"name": "f29", "dtype": "float64"}, {"name": "f30", "dtype": "float64"}, {"name": "f31", "dtype": "float64"}, {"name": "f32", "dtype": "float64"}, {"name": "f33", "dtype": "float64"}, {"name": "f34", "dtype": "float64"}, {"name": "f35", "dtype": "float64"}, {"name": "f36", "dtype": "float64"}, {"name": "f37", "dtype": "float64"}, {"name": "f38", "dtype": "float64"}, {"name": "f39", "dtype": "float64"}, {"name": "f40", "dtype": "float64"}, {"name": "f41", "dtype": "float64"}, {"name": "f42", "dtype": "float64"}, {"name": "f43", "dtype": "float64"}, {"name": "f44", "dtype": "float64"}, {"name": "f45", "dtype": "float64"}, {"name": "f46", "dtype": "float64"}, {"name": "f47", "dtype": "float64"}, {"name": "f48", "dtype": "float64"}, {"name": "f49", "dtype": "float64"}, {"name": "f50", "dtype": "float64"}, {"name": "f51", "dtype": "float64"}, {"name": "f52", "dtype": "float64"}, {"name": "f53", "dtype": "float64"}, {"name": "f54", "dtype": "float64"}, {"name": "f55", "dtype": "float64"}, {"name": "f56", "dtype": "float64"}, {"name": "f57", "dtype": "float64"}, {"name": "f58", "dtype": "float64"}, {"name": "f59", "dtype": "float64"}, {"name": "f60", "dtype": "float64"}, {"name": "f61", "dtype": "float64"}, {"name": "f62", "dtype": "float64"}, {"name": "f63", "dtype": "float64"}, {"name": "f64", "dtype": "float64"}, {"name": "f65", "dtype": "float64"}, {"name": "f66", "dtype": "float64"}, {"name": "f67", "dtype": "float64"}, {"name": "f68", "dtype": "float64"}, {"name": "f69", "dtype": "float64"}, {"name": "f70", "dtype": "float64"}, {"name": "f71", "dtype": "float64"}, {"name": "f72", "dtype": "float64"}, {"name": "f73", "dtype": "float64"}, {"name": "f74", "dtype": "float64"}, {"name": "f75", "dtype": "float64"}, {"name": "f76", "dtype": "float64"}, {"name": "f77", "dtype": "float64"}, {"name": "f78", "dtype": "float64"}, {"name": "f79", "dtype": "float64"}, {"name": "f80", "dtype": "float64"}, {"name": "f81", "dtype": "float64"}, {"name": "f82", "dtype": "float64"}, {"name": "f83", "dtype": "float64"}, {"name": "f84", "dtype": "float64"}, {"name": "f85", "dtype": "float64"}, {"name": "f86", "dtype": "float64"}, {"name": "f87", "dtype": "float64"}, {"name": "f88", "dtype": "float64"}, {"name": "f89", "dtype": "float64"}, {"name": "f90", 
"dtype": "float64"}, {"name": "f91", "dtype": "float64"}, {"name": "f92", "dtype": "float64"}, {"name": "f93", "dtype": "float64"}, {"name": "f94", "dtype": "float64"}, {"name": "f95", "dtype": "float64"}, {"name": "f96", "dtype": "float64"}, {"name": "f97", "dtype": "float64"}, {"name": "f98", "dtype": "float64"}, {"name": "f99", "dtype": "float64"}, {"name": "f100", "dtype": "float64"}, {"name": "f101", "dtype": "float64"}, {"name": "f102", "dtype": "float64"}, {"name": "f103", "dtype": "float64"}, {"name": "f104", "dtype": "float64"}, {"name": "f105", "dtype": "float64"}, {"name": "f106", "dtype": "float64"}, {"name": "f107", "dtype": "float64"}, {"name": "f108", "dtype": "float64"}, {"name": "f109", "dtype": "float64"}, {"name": "f110", "dtype": "float64"}, {"name": "f111", "dtype": "float64"}, {"name": "f112", "dtype": "float64"}, {"name": "f113", "dtype": "float64"}, {"name": "f114", "dtype": "float64"}, {"name": "f115", "dtype": "float64"}, {"name": "f116", "dtype": "float64"}, {"name": "f117", "dtype": "float64"}, {"name": "f118", "dtype": "float64"}, {"name": "f119", "dtype": "float64"}, {"name": "f120", "dtype": "float64"}, {"name": "f121", "dtype": "float64"}, {"name": "f122", "dtype": "float64"}, {"name": "f123", "dtype": "float64"}, {"name": "f124", "dtype": "float64"}, {"name": "f125", "dtype": "float64"}, {"name": "f126", "dtype": "float64"}, {"name": "f127", "dtype": "float64"}, {"name": "f128", "dtype": "float64"}, {"name": "f129", "dtype": "float64"}, {"name": "f130", "dtype": "float64"}, {"name": "f131", "dtype": "float64"}, {"name": "f132", "dtype": "float64"}, {"name": "f133", "dtype": "float64"}, {"name": "f134", "dtype": "float64"}, {"name": "f135", "dtype": "float64"}, {"name": "f136", "dtype": "float64"}, {"name": "f137", "dtype": "float64"}, {"name": "f138", "dtype": "float64"}, {"name": "f139", "dtype": "float64"}, {"name": "f140", "dtype": "float64"}, {"name": "f141", "dtype": "float64"}, {"name": "f142", "dtype": "float64"}, {"name": "f143", "dtype": "float64"}, {"name": "f144", "dtype": "float64"}, {"name": "f145", "dtype": "float64"}, {"name": "f146", "dtype": "float64"}, {"name": "f147", "dtype": "float64"}, {"name": "f148", "dtype": "float64"}, {"name": "f149", "dtype": "float64"}, {"name": "f150", "dtype": "float64"}, {"name": "f151", "dtype": "float64"}, {"name": "f152", "dtype": "float64"}, {"name": "f153", "dtype": "float64"}, {"name": "f154", "dtype": "float64"}, {"name": "f155", "dtype": "float64"}, {"name": "f156", "dtype": "float64"}, {"name": "f157", "dtype": "float64"}, {"name": "f158", "dtype": "float64"}, {"name": "f159", "dtype": "float64"}, {"name": "f160", "dtype": "float64"}, {"name": "f161", "dtype": "float64"}, {"name": "f162", "dtype": "float64"}, {"name": "f163", "dtype": "float64"}, {"name": "f164", "dtype": "float64"}, {"name": "f165", "dtype": "float64"}, {"name": "f166", "dtype": "float64"}, {"name": "f167", "dtype": "float64"}, {"name": "f168", "dtype": "float64"}, {"name": "f169", "dtype": "float64"}, {"name": "f170", "dtype": "float64"}, {"name": "f171", "dtype": "float64"}, {"name": "f172", "dtype": "float64"}, {"name": "f173", "dtype": "float64"}, {"name": "f174", "dtype": "float64"}, {"name": "f175", "dtype": "float64"}, {"name": "f176", "dtype": "float64"}, {"name": "f177", "dtype": "float64"}, {"name": "f178", "dtype": "float64"}, {"name": "f179", "dtype": "float64"}, {"name": "f180", "dtype": "float64"}, {"name": "f181", "dtype": "float64"}, {"name": "f182", "dtype": "float64"}, {"name": "f183", "dtype": "float64"}, {"name": 
"f184", "dtype": "float64"}, {"name": "f185", "dtype": "float64"}, {"name": "f186", "dtype": "float64"}, {"name": "f187", "dtype": "float64"}, {"name": "f188", "dtype": "float64"}, {"name": "f189", "dtype": "float64"}, {"name": "f190", "dtype": "float64"}, {"name": "f191", "dtype": "float64"}, {"name": "f192", "dtype": "float64"}, {"name": "f193", "dtype": "float64"}, {"name": "f194", "dtype": "float64"}, {"name": "f195", "dtype": "float64"}, {"name": "f196", "dtype": "float64"}, {"name": "f197", "dtype": "float64"}, {"name": "f198", "dtype": "float64"}, {"name": "f199", "dtype": "float64"}, {"name": "f200", "dtype": "float64"}, {"name": "f201", "dtype": "float64"}, {"name": "f202", "dtype": "float64"}, {"name": "f203", "dtype": "float64"}, {"name": "f204", "dtype": "float64"}, {"name": "f205", "dtype": "float64"}, {"name": "f206", "dtype": "float64"}, {"name": "f207", "dtype": "float64"}, {"name": "f208", "dtype": "float64"}, {"name": "f209", "dtype": "float64"}, {"name": "f210", "dtype": "float64"}, {"name": "f211", "dtype": "float64"}, {"name": "f212", "dtype": "float64"}, {"name": "f213", "dtype": "float64"}, {"name": "f214", "dtype": "float64"}, {"name": "f215", "dtype": "float64"}, {"name": "f216", "dtype": "float64"}, {"name": "f217", "dtype": "float64"}, {"name": "f218", "dtype": "float64"}, {"name": "f219", "dtype": "float64"}, {"name": "f220", "dtype": "float64"}, {"name": "f221", "dtype": "float64"}, {"name": "f222", "dtype": "float64"}, {"name": "f223", "dtype": "float64"}, {"name": "f224", "dtype": "float64"}, {"name": "f225", "dtype": "float64"}, {"name": "f226", "dtype": "float64"}, {"name": "f227", "dtype": "float64"}, {"name": "f228", "dtype": "float64"}, {"name": "f229", "dtype": "float64"}, {"name": "f230", "dtype": "float64"}, {"name": "f231", "dtype": "float64"}, {"name": "f232", "dtype": "float64"}, {"name": "f233", "dtype": "float64"}, {"name": "f234", "dtype": "float64"}, {"name": "f235", "dtype": "float64"}, {"name": "f236", "dtype": "float64"}, {"name": "f237", "dtype": "float64"}, {"name": "f238", "dtype": "float64"}, {"name": "f239", "dtype": "float64"}, {"name": "f240", "dtype": "float64"}, {"name": "f241", "dtype": "float64"}, {"name": "f242", "dtype": "float64"}, {"name": "f243", "dtype": "float64"}, {"name": "f244", "dtype": "float64"}, {"name": "f245", "dtype": "float64"}, {"name": "f246", "dtype": "float64"}, {"name": "f247", "dtype": "float64"}, {"name": "f248", "dtype": "float64"}, {"name": "f249", "dtype": "float64"}, {"name": "f250", "dtype": "float64"}, {"name": "f251", "dtype": "float64"}, {"name": "f252", "dtype": "float64"}, {"name": "f253", "dtype": "float64"}, {"name": "f254", "dtype": "float64"}, {"name": "f255", "dtype": "float64"}, {"name": "f256", "dtype": "float64"}, {"name": "f257", "dtype": "float64"}, {"name": "f258", "dtype": "float64"}, {"name": "f259", "dtype": "float64"}, {"name": "f260", "dtype": "float64"}, {"name": "f261", "dtype": "float64"}, {"name": "f262", "dtype": "float64"}, {"name": "f263", "dtype": "float64"}, {"name": "f264", "dtype": "float64"}, {"name": "f265", "dtype": "float64"}, {"name": "f266", "dtype": "float64"}, {"name": "f267", "dtype": "float64"}, {"name": "f268", "dtype": "float64"}, {"name": "f269", "dtype": "float64"}, {"name": "f270", "dtype": "float64"}, {"name": "f271", "dtype": "float64"}, {"name": "f272", "dtype": "float64"}, {"name": "f273", "dtype": "float64"}, {"name": "f274", "dtype": "float64"}, {"name": "f275", "dtype": "float64"}, {"name": "f276", "dtype": "float64"}, {"name": "f277", "dtype": 
"float64"}, {"name": "f278", "dtype": "float64"}, {"name": "f279", "dtype": "float64"}, {"name": "f280", "dtype": "float64"}, {"name": "f281", "dtype": "float64"}, {"name": "f282", "dtype": "float64"}, {"name": "f283", "dtype": "float64"}, {"name": "f284", "dtype": "float64"}, {"name": "f285", "dtype": "float64"}, {"name": "f286", "dtype": "float64"}, {"name": "f287", "dtype": "float64"}, {"name": "f288", "dtype": "float64"}, {"name": "f289", "dtype": "float64"}, {"name": "f290", "dtype": "float64"}, {"name": "f291", "dtype": "float64"}, {"name": "f292", "dtype": "float64"}, {"name": "f293", "dtype": "float64"}, {"name": "f294", "dtype": "float64"}, {"name": "f295", "dtype": "float64"}, {"name": "f296", "dtype": "float64"}, {"name": "f297", "dtype": "float64"}, {"name": "f298", "dtype": "float64"}, {"name": "f299", "dtype": "float64"}, {"name": "f300", "dtype": "float64"}, {"name": "f301", "dtype": "float64"}, {"name": "f302", "dtype": "float64"}, {"name": "f303", "dtype": "float64"}, {"name": "f304", "dtype": "float64"}, {"name": "f305", "dtype": "float64"}, {"name": "f306", "dtype": "float64"}, {"name": "f307", "dtype": "float64"}, {"name": "f308", "dtype": "float64"}, {"name": "f309", "dtype": "float64"}, {"name": "f310", "dtype": "float64"}, {"name": "f311", "dtype": "float64"}, {"name": "f312", "dtype": "float64"}, {"name": "f313", "dtype": "float64"}, {"name": "f314", "dtype": "float64"}, {"name": "f315", "dtype": "float64"}, {"name": "f316", "dtype": "float64"}, {"name": "f317", "dtype": "float64"}, {"name": "f318", "dtype": "float64"}, {"name": "f319", "dtype": "float64"}, {"name": "f320", "dtype": "float64"}, {"name": "f321", "dtype": "float64"}, {"name": "f322", "dtype": "float64"}, {"name": "f323", "dtype": "float64"}, {"name": "f324", "dtype": "float64"}, {"name": "f325", "dtype": "float64"}, {"name": "f326", "dtype": "float64"}, {"name": "f327", "dtype": "float64"}, {"name": "f328", "dtype": "float64"}, {"name": "f329", "dtype": "float64"}, {"name": "f330", "dtype": "float64"}, {"name": "f331", "dtype": "float64"}, {"name": "f332", "dtype": "float64"}, {"name": "f333", "dtype": "float64"}, {"name": "f334", "dtype": "float64"}, {"name": "f335", "dtype": "float64"}, {"name": "f336", "dtype": "float64"}, {"name": "f337", "dtype": "float64"}, {"name": "f338", "dtype": "float64"}, {"name": "f339", "dtype": "float64"}, {"name": "f340", "dtype": "float64"}, {"name": "f341", "dtype": "float64"}, {"name": "f342", "dtype": "float64"}, {"name": "f343", "dtype": "float64"}, {"name": "f344", "dtype": "float64"}, {"name": "f345", "dtype": "float64"}, {"name": "f346", "dtype": "float64"}, {"name": "f347", "dtype": "float64"}, {"name": "f348", "dtype": "float64"}, {"name": "f349", "dtype": "float64"}, {"name": "f350", "dtype": "float64"}, {"name": "f351", "dtype": "float64"}, {"name": "f352", "dtype": "float64"}, {"name": "f353", "dtype": "float64"}, {"name": "f354", "dtype": "float64"}, {"name": "f355", "dtype": "float64"}, {"name": "f356", "dtype": "float64"}, {"name": "f357", "dtype": "float64"}, {"name": "f358", "dtype": "float64"}, {"name": "f359", "dtype": "float64"}, {"name": "f360", "dtype": "float64"}, {"name": "f361", "dtype": "float64"}, {"name": "f362", "dtype": "float64"}, {"name": "f363", "dtype": "float64"}, {"name": "f364", "dtype": "float64"}, {"name": "f365", "dtype": "float64"}, {"name": "f366", "dtype": "float64"}, {"name": "f367", "dtype": "float64"}, {"name": "f368", "dtype": "float64"}, {"name": "f369", "dtype": "float64"}, {"name": "f370", "dtype": "float64"}, {"name": 
"f371", "dtype": "float64"}, {"name": "f372", "dtype": "float64"}, {"name": "f373", "dtype": "float64"}, {"name": "f374", "dtype": "float64"}, {"name": "f375", "dtype": "float64"}, {"name": "f376", "dtype": "float64"}, {"name": "f377", "dtype": "float64"}, {"name": "f378", "dtype": "float64"}, {"name": "f379", "dtype": "float64"}, {"name": "f380", "dtype": "float64"}, {"name": "f381", "dtype": "float64"}, {"name": "f382", "dtype": "float64"}, {"name": "f383", "dtype": "float64"}, {"name": "f384", "dtype": "float64"}, {"name": "f385", "dtype": "float64"}, {"name": "f386", "dtype": "float64"}, {"name": "f387", "dtype": "float64"}, {"name": "f388", "dtype": "float64"}, {"name": "f389", "dtype": "float64"}, {"name": "f390", "dtype": "float64"}, {"name": "f391", "dtype": "float64"}, {"name": "f392", "dtype": "float64"}, {"name": "f393", "dtype": "float64"}, {"name": "f394", "dtype": "float64"}, {"name": "f395", "dtype": "float64"}, {"name": "f396", "dtype": "float64"}, {"name": "f397", "dtype": "float64"}, {"name": "f398", "dtype": "float64"}, {"name": "f399", "dtype": "float64"}, {"name": "f400", "dtype": "float64"}, {"name": "f401", "dtype": "float64"}, {"name": "f402", "dtype": "float64"}, {"name": "f403", "dtype": "float64"}, {"name": "f404", "dtype": "float64"}, {"name": "f405", "dtype": "float64"}, {"name": "f406", "dtype": "float64"}, {"name": "f407", "dtype": "float64"}, {"name": "f408", "dtype": "float64"}, {"name": "f409", "dtype": "float64"}, {"name": "f410", "dtype": "float64"}, {"name": "f411", "dtype": "float64"}, {"name": "f412", "dtype": "float64"}, {"name": "f413", "dtype": "float64"}, {"name": "f414", "dtype": "float64"}, {"name": "f415", "dtype": "float64"}, {"name": "f416", "dtype": "float64"}, {"name": "f417", "dtype": "float64"}, {"name": "f418", "dtype": "float64"}, {"name": "f419", "dtype": "float64"}, {"name": "f420", "dtype": "float64"}, {"name": "f421", "dtype": "float64"}, {"name": "f422", "dtype": "float64"}, {"name": "f423", "dtype": "float64"}, {"name": "f424", "dtype": "float64"}, {"name": "f425", "dtype": "float64"}, {"name": "f426", "dtype": "float64"}, {"name": "f427", "dtype": "float64"}, {"name": "f428", "dtype": "float64"}, {"name": "f429", "dtype": "float64"}, {"name": "f430", "dtype": "float64"}, {"name": "f431", "dtype": "float64"}, {"name": "f432", "dtype": "float64"}, {"name": "f433", "dtype": "float64"}, {"name": "f434", "dtype": "float64"}, {"name": "f435", "dtype": "float64"}, {"name": "f436", "dtype": "float64"}, {"name": "f437", "dtype": "float64"}, {"name": "f438", "dtype": "float64"}, {"name": "f439", "dtype": "float64"}, {"name": "f440", "dtype": "float64"}, {"name": "f441", "dtype": "float64"}, {"name": "f442", "dtype": "float64"}, {"name": "f443", "dtype": "float64"}, {"name": "f444", "dtype": "float64"}, {"name": "f445", "dtype": "float64"}, {"name": "f446", "dtype": "float64"}, {"name": "f447", "dtype": "float64"}, {"name": "f448", "dtype": "float64"}, {"name": "f449", "dtype": "float64"}, {"name": "f450", "dtype": "float64"}, {"name": "f451", "dtype": "float64"}, {"name": "f452", "dtype": "float64"}, {"name": "f453", "dtype": "float64"}, {"name": "f454", "dtype": "float64"}, {"name": "f455", "dtype": "float64"}, {"name": "f456", "dtype": "float64"}, {"name": "f457", "dtype": "float64"}, {"name": "f458", "dtype": "float64"}, {"name": "f459", "dtype": "float64"}, {"name": "f460", "dtype": "float64"}, {"name": "f461", "dtype": "float64"}, {"name": "f462", "dtype": "float64"}, {"name": "f463", "dtype": "float64"}, {"name": "f464", "dtype": 
"float64"}, {"name": "f465", "dtype": "float64"}, {"name": "f466", "dtype": "float64"}, {"name": "f467", "dtype": "float64"}, {"name": "f468", "dtype": "float64"}, {"name": "f469", "dtype": "float64"}, {"name": "f470", "dtype": "float64"}, {"name": "f471", "dtype": "float64"}, {"name": "f472", "dtype": "float64"}, {"name": "f473", "dtype": "float64"}, {"name": "f474", "dtype": "float64"}, {"name": "f475", "dtype": "float64"}, {"name": "f476", "dtype": "float64"}, {"name": "f477", "dtype": "float64"}, {"name": "f478", "dtype": "float64"}, {"name": "f479", "dtype": "float64"}, {"name": "f480", "dtype": "float64"}, {"name": "f481", "dtype": "float64"}, {"name": "f482", "dtype": "float64"}, {"name": "f483", "dtype": "float64"}, {"name": "f484", "dtype": "float64"}, {"name": "f485", "dtype": "float64"}, {"name": "f486", "dtype": "float64"}, {"name": "f487", "dtype": "float64"}, {"name": "f488", "dtype": "float64"}, {"name": "f489", "dtype": "float64"}, {"name": "f490", "dtype": "float64"}, {"name": "f491", "dtype": "float64"}, {"name": "f492", "dtype": "float64"}, {"name": "f493", "dtype": "float64"}, {"name": "f494", "dtype": "float64"}, {"name": "f495", "dtype": "float64"}, {"name": "f496", "dtype": "float64"}, {"name": "f497", "dtype": "float64"}, {"name": "f498", "dtype": "float64"}, {"name": "f499", "dtype": "float64"}, {"name": "f500", "dtype": "float64"}, {"name": "f501", "dtype": "float64"}, {"name": "f502", "dtype": "float64"}, {"name": "f503", "dtype": "float64"}, {"name": "f504", "dtype": "float64"}, {"name": "f505", "dtype": "float64"}, {"name": "f506", "dtype": "float64"}, {"name": "f507", "dtype": "float64"}, {"name": "f508", "dtype": "float64"}, {"name": "f509", "dtype": "float64"}, {"name": "f510", "dtype": "float64"}, {"name": "f511", "dtype": "float64"}, {"name": "f512", "dtype": "float64"}, {"name": "f513", "dtype": "float64"}, {"name": "f514", "dtype": "float64"}, {"name": "f515", "dtype": "float64"}, {"name": "f516", "dtype": "float64"}, {"name": "f517", "dtype": "float64"}, {"name": "f518", "dtype": "float64"}, {"name": "f519", "dtype": "float64"}, {"name": "f520", "dtype": "float64"}, {"name": "f521", "dtype": "float64"}, {"name": "f522", "dtype": "float64"}, {"name": "f523", "dtype": "float64"}, {"name": "f524", "dtype": "float64"}, {"name": "f525", "dtype": "float64"}, {"name": "f526", "dtype": "float64"}, {"name": "f527", "dtype": "float64"}, {"name": "f528", "dtype": "float64"}, {"name": "f529", "dtype": "float64"}, {"name": "f530", "dtype": "float64"}, {"name": "f531", "dtype": "float64"}, {"name": "f532", "dtype": "float64"}, {"name": "f533", "dtype": "float64"}, {"name": "f534", "dtype": "float64"}, {"name": "f535", "dtype": "float64"}, {"name": "f536", "dtype": "float64"}, {"name": "f537", "dtype": "float64"}, {"name": "f538", "dtype": "float64"}, {"name": "f539", "dtype": "float64"}, {"name": "f540", "dtype": "float64"}, {"name": "f541", "dtype": "float64"}, {"name": "f542", "dtype": "float64"}, {"name": "f543", "dtype": "float64"}, {"name": "f544", "dtype": "float64"}, {"name": "f545", "dtype": "float64"}, {"name": "f546", "dtype": "float64"}, {"name": "f547", "dtype": "float64"}, {"name": "f548", "dtype": "float64"}, {"name": "f549", "dtype": "float64"}, {"name": "f550", "dtype": "float64"}, {"name": "f551", "dtype": "float64"}, {"name": "f552", "dtype": "float64"}, {"name": "f553", "dtype": "float64"}, {"name": "f554", "dtype": "float64"}, {"name": "f555", "dtype": "float64"}, {"name": "f556", "dtype": "float64"}, {"name": "f557", "dtype": "float64"}, {"name": 
"f558", "dtype": "float64"}, {"name": "f559", "dtype": "float64"}, {"name": "f560", "dtype": "float64"}, {"name": "f561", "dtype": "float64"}, {"name": "f562", "dtype": "float64"}, {"name": "f563", "dtype": "float64"}, {"name": "f564", "dtype": "float64"}, {"name": "f565", "dtype": "float64"}, {"name": "f566", "dtype": "float64"}, {"name": "f567", "dtype": "float64"}, {"name": "f568", "dtype": "float64"}, {"name": "f569", "dtype": "float64"}, {"name": "f570", "dtype": "float64"}, {"name": "f571", "dtype": "float64"}, {"name": "f572", "dtype": "float64"}, {"name": "f573", "dtype": "float64"}, {"name": "f574", "dtype": "float64"}, {"name": "f575", "dtype": "float64"}, {"name": "f576", "dtype": "float64"}, {"name": "f577", "dtype": "float64"}, {"name": "f578", "dtype": "float64"}, {"name": "f579", "dtype": "float64"}, {"name": "f580", "dtype": "float64"}, {"name": "f581", "dtype": "float64"}, {"name": "f582", "dtype": "float64"}, {"name": "f583", "dtype": "float64"}, {"name": "f584", "dtype": "float64"}, {"name": "f585", "dtype": "float64"}, {"name": "f586", "dtype": "float64"}, {"name": "f587", "dtype": "float64"}, {"name": "f588", "dtype": "float64"}, {"name": "f589", "dtype": "float64"}, {"name": "f590", "dtype": "float64"}, {"name": "f591", "dtype": "float64"}, {"name": "f592", "dtype": "float64"}, {"name": "f593", "dtype": "float64"}, {"name": "f594", "dtype": "float64"}, {"name": "f595", "dtype": "float64"}, {"name": "f596", "dtype": "float64"}, {"name": "f597", "dtype": "float64"}, {"name": "f598", "dtype": "float64"}, {"name": "f599", "dtype": "float64"}, {"name": "f600", "dtype": "float64"}, {"name": "f601", "dtype": "float64"}, {"name": "f602", "dtype": "float64"}, {"name": "f603", "dtype": "float64"}, {"name": "f604", "dtype": "float64"}, {"name": "f605", "dtype": "float64"}, {"name": "f606", "dtype": "float64"}, {"name": "f607", "dtype": "float64"}, {"name": "f608", "dtype": "float64"}, {"name": "f609", "dtype": "float64"}, {"name": "f610", "dtype": "float64"}, {"name": "f611", "dtype": "float64"}, {"name": "f612", "dtype": "float64"}, {"name": "f613", "dtype": "float64"}, {"name": "f614", "dtype": "float64"}, {"name": "f615", "dtype": "float64"}, {"name": "f616", "dtype": "float64"}, {"name": "f617", "dtype": "float64"}, {"name": "f618", "dtype": "float64"}, {"name": "f619", "dtype": "float64"}, {"name": "f620", "dtype": "float64"}, {"name": "f621", "dtype": "float64"}, {"name": "f622", "dtype": "float64"}, {"name": "f623", "dtype": "float64"}, {"name": "f624", "dtype": "float64"}, {"name": "f625", "dtype": "float64"}, {"name": "f626", "dtype": "float64"}, {"name": "f627", "dtype": "float64"}, {"name": "f628", "dtype": "float64"}, {"name": "f629", "dtype": "float64"}, {"name": "f630", "dtype": "float64"}, {"name": "f631", "dtype": "float64"}, {"name": "f632", "dtype": "float64"}, {"name": "f633", "dtype": "float64"}, {"name": "f634", "dtype": "float64"}, {"name": "f635", "dtype": "float64"}, {"name": "f636", "dtype": "float64"}, {"name": "f637", "dtype": "float64"}, {"name": "f638", "dtype": "float64"}, {"name": "f639", "dtype": "float64"}, {"name": "f640", "dtype": "float64"}, {"name": "f641", "dtype": "float64"}, {"name": "f642", "dtype": "float64"}, {"name": "f643", "dtype": "float64"}, {"name": "f644", "dtype": "float64"}, {"name": "f645", "dtype": "float64"}, {"name": "f646", "dtype": "float64"}, {"name": "f647", "dtype": "float64"}, {"name": "f648", "dtype": "float64"}, {"name": "f649", "dtype": "float64"}, {"name": "f650", "dtype": "float64"}, {"name": "f651", "dtype": 
"float64"}, {"name": "f652", "dtype": "float64"}, {"name": "f653", "dtype": "float64"}, {"name": "f654", "dtype": "float64"}, {"name": "f655", "dtype": "float64"}, {"name": "f656", "dtype": "float64"}, {"name": "f657", "dtype": "float64"}, {"name": "f658", "dtype": "float64"}, {"name": "f659", "dtype": "float64"}, {"name": "f660", "dtype": "float64"}, {"name": "f661", "dtype": "float64"}, {"name": "f662", "dtype": "float64"}, {"name": "f663", "dtype": "float64"}, {"name": "f664", "dtype": "float64"}, {"name": "f665", "dtype": "float64"}, {"name": "f666", "dtype": "float64"}, {"name": "f667", "dtype": "float64"}, {"name": "f668", "dtype": "float64"}, {"name": "f669", "dtype": "float64"}, {"name": "f670", "dtype": "float64"}, {"name": "f671", "dtype": "float64"}, {"name": "f672", "dtype": "float64"}, {"name": "f673", "dtype": "float64"}, {"name": "f674", "dtype": "float64"}, {"name": "f675", "dtype": "float64"}, {"name": "f676", "dtype": "float64"}, {"name": "f677", "dtype": "float64"}, {"name": "f678", "dtype": "float64"}, {"name": "f679", "dtype": "float64"}, {"name": "f680", "dtype": "float64"}, {"name": "f681", "dtype": "float64"}, {"name": "f682", "dtype": "float64"}, {"name": "f683", "dtype": "float64"}, {"name": "f684", "dtype": "float64"}, {"name": "f685", "dtype": "float64"}, {"name": "f686", "dtype": "float64"}, {"name": "f687", "dtype": "float64"}, {"name": "f688", "dtype": "float64"}, {"name": "f689", "dtype": "float64"}, {"name": "f690", "dtype": "float64"}, {"name": "f691", "dtype": "float64"}, {"name": "f692", "dtype": "float64"}, {"name": "f693", "dtype": "float64"}, {"name": "f694", "dtype": "float64"}, {"name": "f695", "dtype": "float64"}, {"name": "f696", "dtype": "float64"}, {"name": "f697", "dtype": "float64"}, {"name": "f698", "dtype": "float64"}, {"name": "f699", "dtype": "float64"}, {"name": "f700", "dtype": "float64"}, {"name": "f701", "dtype": "float64"}, {"name": "f702", "dtype": "float64"}, {"name": "f703", "dtype": "float64"}, {"name": "f704", "dtype": "float64"}, {"name": "f705", "dtype": "float64"}, {"name": "f706", "dtype": "float64"}, {"name": "f707", "dtype": "float64"}, {"name": "f708", "dtype": "float64"}, {"name": "f709", "dtype": "float64"}, {"name": "f710", "dtype": "float64"}, {"name": "f711", "dtype": "float64"}, {"name": "f712", "dtype": "float64"}, {"name": "f713", "dtype": "float64"}, {"name": "f714", "dtype": "float64"}, {"name": "f715", "dtype": "float64"}, {"name": "f716", "dtype": "float64"}, {"name": "f717", "dtype": "float64"}, {"name": "f718", "dtype": "float64"}, {"name": "f719", "dtype": "float64"}, {"name": "f720", "dtype": "float64"}, {"name": "f721", "dtype": "float64"}, {"name": "f722", "dtype": "float64"}, {"name": "f723", "dtype": "float64"}, {"name": "f724", "dtype": "float64"}, {"name": "f725", "dtype": "float64"}, {"name": "f726", "dtype": "float64"}, {"name": "f727", "dtype": "float64"}, {"name": "f728", "dtype": "float64"}, {"name": "f729", "dtype": "float64"}, {"name": "f730", "dtype": "float64"}, {"name": "f731", "dtype": "float64"}, {"name": "f732", "dtype": "float64"}, {"name": "f733", "dtype": "float64"}, {"name": "f734", "dtype": "float64"}, {"name": "f735", "dtype": "float64"}, {"name": "f736", "dtype": "float64"}, {"name": "f737", "dtype": "float64"}, {"name": "f738", "dtype": "float64"}, {"name": "f739", "dtype": "float64"}, {"name": "f740", "dtype": "float64"}, {"name": "f741", "dtype": "float64"}, {"name": "f742", "dtype": "float64"}, {"name": "f743", "dtype": "float64"}, {"name": "f744", "dtype": "float64"}, {"name": 
"f745", "dtype": "float64"}, {"name": "f746", "dtype": "float64"}, {"name": "f747", "dtype": "float64"}, {"name": "f748", "dtype": "float64"}, {"name": "f749", "dtype": "float64"}, {"name": "f750", "dtype": "float64"}, {"name": "f751", "dtype": "float64"}, {"name": "f752", "dtype": "float64"}, {"name": "f753", "dtype": "float64"}, {"name": "f754", "dtype": "float64"}, {"name": "f755", "dtype": "float64"}, {"name": "f756", "dtype": "float64"}, {"name": "f757", "dtype": "float64"}, {"name": "f758", "dtype": "float64"}, {"name": "f759", "dtype": "float64"}, {"name": "f760", "dtype": "float64"}, {"name": "f761", "dtype": "float64"}, {"name": "f762", "dtype": "float64"}, {"name": "f763", "dtype": "float64"}, {"name": "f764", "dtype": "float64"}, {"name": "f765", "dtype": "float64"}, {"name": "f766", "dtype": "float64"}, {"name": "f767", "dtype": "float64"}, {"name": "f768", "dtype": "float64"}, {"name": "f769", "dtype": "float64"}, {"name": "f770", "dtype": "float64"}, {"name": "f771", "dtype": "float64"}, {"name": "f772", "dtype": "float64"}, {"name": "f773", "dtype": "float64"}, {"name": "f774", "dtype": "float64"}, {"name": "f775", "dtype": "float64"}, {"name": "f776", "dtype": "float64"}, {"name": "f777", "dtype": "float64"}, {"name": "f778", "dtype": "float64"}, {"name": "f779", "dtype": "float64"}, {"name": "f780", "dtype": "float64"}, {"name": "f781", "dtype": "float64"}, {"name": "f782", "dtype": "float64"}, {"name": "f783", "dtype": "float64"}, {"name": "f784", "dtype": "float64"}, {"name": "f785", "dtype": "float64"}, {"name": "f786", "dtype": "float64"}, {"name": "f787", "dtype": "float64"}, {"name": "f788", "dtype": "float64"}, {"name": "f789", "dtype": "float64"}, {"name": "f790", "dtype": "float64"}, {"name": "f791", "dtype": "float64"}, {"name": "f792", "dtype": "float64"}, {"name": "f793", "dtype": "float64"}, {"name": "f794", "dtype": "float64"}, {"name": "f795", "dtype": "float64"}, {"name": "f796", "dtype": "float64"}, {"name": "f797", "dtype": "float64"}, {"name": "f798", "dtype": "float64"}, {"name": "f799", "dtype": "float64"}, {"name": "f800", "dtype": "float64"}, {"name": "f801", "dtype": "float64"}, {"name": "f802", "dtype": "float64"}, {"name": "f803", "dtype": "float64"}, {"name": "f804", "dtype": "float64"}, {"name": "f805", "dtype": "float64"}, {"name": "f806", "dtype": "float64"}, {"name": "f807", "dtype": "float64"}, {"name": "f808", "dtype": "float64"}, {"name": "f809", "dtype": "float64"}, {"name": "f810", "dtype": "float64"}, {"name": "f811", "dtype": "float64"}, {"name": "f812", "dtype": "float64"}, {"name": "f813", "dtype": "float64"}, {"name": "f814", "dtype": "float64"}, {"name": "f815", "dtype": "float64"}, {"name": "f816", "dtype": "float64"}, {"name": "f817", "dtype": "float64"}, {"name": "f818", "dtype": "float64"}, {"name": "f819", "dtype": "float64"}, {"name": "f820", "dtype": "float64"}, {"name": "f821", "dtype": "float64"}, {"name": "f822", "dtype": "float64"}, {"name": "f823", "dtype": "float64"}, {"name": "f824", "dtype": "float64"}, {"name": "f825", "dtype": "float64"}, {"name": "f826", "dtype": "float64"}, {"name": "f827", "dtype": "float64"}, {"name": "f828", "dtype": "float64"}, {"name": "f829", "dtype": "float64"}, {"name": "f830", "dtype": "float64"}, {"name": "f831", "dtype": "float64"}, {"name": "f832", "dtype": "float64"}, {"name": "f833", "dtype": "float64"}, {"name": "f834", "dtype": "float64"}, {"name": "f835", "dtype": "float64"}, {"name": "f836", "dtype": "float64"}, {"name": "f837", "dtype": "float64"}, {"name": "f838", "dtype": 
"float64"}, {"name": "f839", "dtype": "float64"}, {"name": "f840", "dtype": "float64"}, {"name": "f841", "dtype": "float64"}, {"name": "f842", "dtype": "float64"}, {"name": "f843", "dtype": "float64"}, {"name": "f844", "dtype": "float64"}, {"name": "f845", "dtype": "float64"}, {"name": "f846", "dtype": "float64"}, {"name": "f847", "dtype": "float64"}, {"name": "f848", "dtype": "float64"}, {"name": "f849", "dtype": "float64"}, {"name": "f850", "dtype": "float64"}, {"name": "f851", "dtype": "float64"}, {"name": "f852", "dtype": "float64"}, {"name": "f853", "dtype": "float64"}, {"name": "f854", "dtype": "float64"}, {"name": "f855", "dtype": "float64"}, {"name": "f856", "dtype": "float64"}, {"name": "f857", "dtype": "float64"}, {"name": "f858", "dtype": "float64"}, {"name": "f859", "dtype": "float64"}, {"name": "f860", "dtype": "float64"}, {"name": "f861", "dtype": "float64"}, {"name": "f862", "dtype": "float64"}, {"name": "f863", "dtype": "float64"}, {"name": "f864", "dtype": "float64"}, {"name": "f865", "dtype": "float64"}, {"name": "f866", "dtype": "float64"}, {"name": "f867", "dtype": "float64"}, {"name": "f868", "dtype": "float64"}, {"name": "f869", "dtype": "float64"}, {"name": "f870", "dtype": "float64"}, {"name": "f871", "dtype": "float64"}, {"name": "f872", "dtype": "float64"}, {"name": "f873", "dtype": "float64"}, {"name": "f874", "dtype": "float64"}, {"name": "f875", "dtype": "float64"}, {"name": "f876", "dtype": "float64"}, {"name": "f877", "dtype": "float64"}, {"name": "f878", "dtype": "float64"}, {"name": "f879", "dtype": "float64"}, {"name": "f880", "dtype": "float64"}, {"name": "f881", "dtype": "float64"}, {"name": "f882", "dtype": "float64"}, {"name": "f883", "dtype": "float64"}, {"name": "f884", "dtype": "float64"}, {"name": "f885", "dtype": "float64"}, {"name": "f886", "dtype": "float64"}, {"name": "f887", "dtype": "float64"}, {"name": "f888", "dtype": "float64"}, {"name": "f889", "dtype": "float64"}, {"name": "f890", "dtype": "float64"}, {"name": "f891", "dtype": "float64"}, {"name": "f892", "dtype": "float64"}, {"name": "f893", "dtype": "float64"}, {"name": "f894", "dtype": "float64"}, {"name": "f895", "dtype": "float64"}, {"name": "f896", "dtype": "float64"}, {"name": "f897", "dtype": "float64"}, {"name": "f898", "dtype": "float64"}, {"name": "f899", "dtype": "float64"}, {"name": "f900", "dtype": "float64"}, {"name": "f901", "dtype": "float64"}, {"name": "f902", "dtype": "float64"}, {"name": "f903", "dtype": "float64"}, {"name": "f904", "dtype": "float64"}, {"name": "f905", "dtype": "float64"}, {"name": "f906", "dtype": "float64"}, {"name": "f907", "dtype": "float64"}, {"name": "f908", "dtype": "float64"}, {"name": "f909", "dtype": "float64"}, {"name": "f910", "dtype": "float64"}, {"name": "f911", "dtype": "float64"}, {"name": "f912", "dtype": "float64"}, {"name": "f913", "dtype": "float64"}, {"name": "f914", "dtype": "float64"}, {"name": "f915", "dtype": "float64"}, {"name": "f916", "dtype": "float64"}, {"name": "f917", "dtype": "float64"}, {"name": "f918", "dtype": "float64"}, {"name": "f919", "dtype": "float64"}, {"name": "f920", "dtype": "float64"}, {"name": "f921", "dtype": "float64"}, {"name": "f922", "dtype": "float64"}, {"name": "f923", "dtype": "float64"}, {"name": "f924", "dtype": "float64"}, {"name": "f925", "dtype": "float64"}, {"name": "f926", "dtype": "float64"}, {"name": "f927", "dtype": "float64"}, {"name": "f928", "dtype": "float64"}, {"name": "f929", "dtype": "float64"}, {"name": "f930", "dtype": "float64"}, {"name": "f931", "dtype": "float64"}, {"name": 
"f932", "dtype": "float64"}, {"name": "f933", "dtype": "float64"}, {"name": "f934", "dtype": "float64"}, {"name": "f935", "dtype": "float64"}, {"name": "f936", "dtype": "float64"}, {"name": "f937", "dtype": "float64"}, {"name": "f938", "dtype": "float64"}, {"name": "f939", "dtype": "float64"}, {"name": "f940", "dtype": "float64"}, {"name": "f941", "dtype": "float64"}, {"name": "f942", "dtype": "float64"}, {"name": "f943", "dtype": "float64"}, {"name": "f944", "dtype": "float64"}, {"name": "f945", "dtype": "float64"}, {"name": "f946", "dtype": "float64"}, {"name": "f947", "dtype": "float64"}, {"name": "f948", "dtype": "float64"}, {"name": "f949", "dtype": "float64"}, {"name": "f950", "dtype": "float64"}, {"name": "f951", "dtype": "float64"}, {"name": "f952", "dtype": "float64"}, {"name": "f953", "dtype": "float64"}, {"name": "f954", "dtype": "float64"}, {"name": "f955", "dtype": "float64"}, {"name": "f956", "dtype": "float64"}, {"name": "f957", "dtype": "float64"}, {"name": "f958", "dtype": "float64"}, {"name": "f959", "dtype": "float64"}, {"name": "f960", "dtype": "float64"}, {"name": "f961", "dtype": "float64"}, {"name": "f962", "dtype": "float64"}, {"name": "f963", "dtype": "float64"}, {"name": "f964", "dtype": "float64"}, {"name": "f965", "dtype": "float64"}, {"name": "f966", "dtype": "float64"}, {"name": "f967", "dtype": "float64"}, {"name": "f968", "dtype": "float64"}, {"name": "f969", "dtype": "float64"}, {"name": "f970", "dtype": "float64"}, {"name": "f971", "dtype": "float64"}, {"name": "f972", "dtype": "float64"}, {"name": "f973", "dtype": "float64"}, {"name": "f974", "dtype": "float64"}, {"name": "f975", "dtype": "float64"}, {"name": "f976", "dtype": "float64"}, {"name": "f977", "dtype": "float64"}, {"name": "f978", "dtype": "float64"}, {"name": "f979", "dtype": "float64"}, {"name": "f980", "dtype": "float64"}, {"name": "f981", "dtype": "float64"}, {"name": "f982", "dtype": "float64"}, {"name": "f983", "dtype": "float64"}, {"name": "f984", "dtype": "float64"}, {"name": "f985", "dtype": "float64"}, {"name": "f986", "dtype": "float64"}, {"name": "f987", "dtype": "float64"}, {"name": "f988", "dtype": "float64"}, {"name": "f989", "dtype": "float64"}, {"name": "f990", "dtype": "float64"}, {"name": "f991", "dtype": "float64"}, {"name": "f992", "dtype": "float64"}, {"name": "f993", "dtype": "float64"}, {"name": "f994", "dtype": "float64"}, {"name": "f995", "dtype": "float64"}, {"name": "f996", "dtype": "float64"}, {"name": "f997", "dtype": "float64"}, {"name": "f998", "dtype": "float64"}, {"name": "f999", "dtype": "float64"}, {"name": "f1000", "dtype": "float64"}, {"name": "f1001", "dtype": "float64"}, {"name": "f1002", "dtype": "float64"}, {"name": "f1003", "dtype": "float64"}, {"name": "f1004", "dtype": "float64"}, {"name": "f1005", "dtype": "float64"}, {"name": "f1006", "dtype": "float64"}, {"name": "f1007", "dtype": "float64"}, {"name": "f1008", "dtype": "float64"}, {"name": "f1009", "dtype": "float64"}, {"name": "f1010", "dtype": "float64"}, {"name": "f1011", "dtype": "float64"}, {"name": "f1012", "dtype": "float64"}, {"name": "f1013", "dtype": "float64"}, {"name": "f1014", "dtype": "float64"}, {"name": "f1015", "dtype": "float64"}, {"name": "f1016", "dtype": "float64"}, {"name": "f1017", "dtype": "float64"}, {"name": "f1018", "dtype": "float64"}, {"name": "f1019", "dtype": "float64"}, {"name": "f1020", "dtype": "float64"}, {"name": "f1021", "dtype": "float64"}, {"name": "f1022", "dtype": "float64"}, {"name": "f1023", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 
128590756, "num_examples": 15000}], "download_size": 151947227, "dataset_size": 128590756}} | 2024-02-13T22:21:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset of proteins created by diffusion models | [
"# Dataset of proteins created by diffusion models"
] | [
"TAGS\n#region-us \n",
"# Dataset of proteins created by diffusion models"
] |
5c7690518e481375551916f24241048cf7b017d0 |
# Ko-miracl
This dataset represents a conversion of the Korean (Ko) section from the [miracl dataset](https://huggingface.co/datasets/miracl/miracl) into the [BeIR](https://github.com/beir-cellar/beir) format, making it compatible for use with [mteb](https://github.com/embeddings-benchmark/mteb). | taeminlee/Ko-miracl | [
"task_categories:text-retrieval",
"task_ids:document-retrieval",
"multilinguality:monolingual",
"size_categories:10K<n<100K",
"source_datasets:miracl",
"language:ko",
"text-retrieval",
"region:us"
] | 2024-01-16T08:46:29+00:00 | {"language": ["ko"], "multilinguality": ["monolingual"], "size_categories": ["10K<n<100K"], "source_datasets": ["miracl"], "task_categories": ["text-retrieval"], "task_ids": ["document-retrieval"], "config_names": ["corpus"], "tags": ["text-retrieval"], "dataset_info": [{"config_name": "default", "features": [{"name": "query-id", "dtype": "string"}, {"name": "corpus-id", "dtype": "string"}, {"name": "score", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 347785, "num_examples": 12767}, {"name": "dev", "num_bytes": 83188, "num_examples": 3057}]}, {"config_name": "corpus", "features": [{"name": "_id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "corpus", "num_bytes": 633206834, "num_examples": 1486752}]}, {"config_name": "queries", "features": [{"name": "_id", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "queries", "num_bytes": 174597, "num_examples": 2761}]}], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "qrels/train.jsonl"}, {"split": "dev", "path": "qrels/dev.jsonl"}]}, {"config_name": "corpus", "data_files": [{"split": "corpus", "path": "corpus.jsonl"}]}, {"config_name": "queries", "data_files": [{"split": "queries", "path": "queries.jsonl"}]}]} | 2024-01-19T02:13:05+00:00 | [] | [
"ko"
] | TAGS
#task_categories-text-retrieval #task_ids-document-retrieval #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-miracl #language-Korean #text-retrieval #region-us
|
# Ko-miracl
This dataset represents a conversion of the Korean (Ko) section from the miracl dataset into the BeIR format, making it compatible for use with mteb. | [
"# Ko-miracl\n\nThis dataset represents a conversion of the Korean (Ko) section from the miracl dataset into the BeIR format, making it compatible for use with mteb."
] | [
"TAGS\n#task_categories-text-retrieval #task_ids-document-retrieval #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-miracl #language-Korean #text-retrieval #region-us \n",
"# Ko-miracl\n\nThis dataset represents a conversion of the Korean (Ko) section from the miracl dataset into the BeIR format, making it compatible for use with mteb."
] |
fcefd5125164191c2c3bbc140b04dd28e9e98fa9 |
# first try
Transferring an amazing dataset. | joey1895/new03 | [
"license:apache-2.0",
"region:us"
] | 2024-01-16T08:53:44+00:00 | {"license": "apache-2.0", "configs": [{"config_name": "new03", "data_files": [{"split": "train", "path": "train-00000-of-00001.parquet"}, {"split": "test", "path": "test-00000-of-00001.parquet"}, {"split": "validation", "path": "validation-00000-of-00001.parquet"}]}]} | 2024-01-16T09:05:06+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
# first try
Transferring an amazing dataset. | [
"# first try\ntransfer amazing dataset"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# first try\ntransfer amazing dataset"
] |
55cab69588db84844d95092fd8bf9c0db2ae9922 |
The raw data has been curated from <https://www.st-minutiae.com/resources/scripts/#thenextgeneration>
<br>
This has been cleaned using the code found in <https://github.com/progs2002/StarTrekTNG-ScriptGenerator/blob/master/process_data.py>
<br>
The text files have been assigned to training and testing in an 80-20 split. | progs2002/star-trek-tng-scripts | [
"license:mit",
"region:us"
] | 2024-01-16T08:55:10+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7768730, "num_examples": 156}, {"name": "test", "num_bytes": 850567, "num_examples": 18}], "download_size": 5081082, "dataset_size": 8619297}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-21T07:23:40+00:00 | [] | [] | TAGS
#license-mit #region-us
|
The raw data has been curated from <URL
<br>
This has been cleaned using the code found in <URL
<br>
The text files have been assigned to training and testing in an 80-20 split. | [] | [
"TAGS\n#license-mit #region-us \n"
] |
df2d298f23ecf6275daefaba6298466da157d056 | ### Please give the repo a :star:
| Build | Support Server |
|-------|---------|
| [](https://github.com/keiyoushi/extensions-source/actions/workflows/build_push.yml) | [](https://discord.gg/3FbCpdKbdY) |
# Usage
https://github.com/keiyoushi/extensions/blob/main/README.md
# Contributing
Contributions are welcome!
Check out the repo's [issue backlog](https://github.com/keiyoushi/extensions-source/issues) for source requests and bug reports.
## License
Copyright 2015 Javier Tomás
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
## Disclaimer
This project is not affiliated with Tachiyomi. Don't ask for help about these extensions through Tachiyomi's official support channels. All credit for the codebase goes to the original contributors.
| makisekurisu-jp/extensions-source | [
"region:us"
] | 2024-01-16T09:01:44+00:00 | {} | 2024-01-16T09:16:44+00:00 | [] | [] | TAGS
#region-us
| ### Please give the repo a :star:
Usage
=====
URL
Contributing
============
Contributions are welcome!
Check out the repo's issue backlog for source requests and bug reports.
License
-------
```
Copyright 2015 Javier Tomás
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
URL
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
Disclaimer
----------
This project is not affiliated with Tachiyomi. Don't ask for help about these extensions through Tachiyomi's official support channels. All credit for the codebase goes to the original contributors.
| [
"### Please give the repo a :star:\n\n\n\nUsage\n=====\n\n\nURL\n\n\nContributing\n============\n\n\nContributions are welcome!\n\n\nCheck out the repo's issue backlog for source requests and bug reports.\n\n\nLicense\n-------\n\n\n\n```\nCopyright 2015 Javier Tomás\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\nURL\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n\n```\n\nDisclaimer\n----------\n\n\nThis project is not affiliated with Tachiyomi. Don't ask for help about these extensions at the official support means of Tachiyomi. All credits to the codebase goes to the original contributors."
] | [
"TAGS\n#region-us \n",
"### Please give the repo a :star:\n\n\n\nUsage\n=====\n\n\nURL\n\n\nContributing\n============\n\n\nContributions are welcome!\n\n\nCheck out the repo's issue backlog for source requests and bug reports.\n\n\nLicense\n-------\n\n\n\n```\nCopyright 2015 Javier Tomás\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\nURL\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n\n```\n\nDisclaimer\n----------\n\n\nThis project is not affiliated with Tachiyomi. Don't ask for help about these extensions at the official support means of Tachiyomi. All credits to the codebase goes to the original contributors."
] |
5780d878c3601b9ea458fe208229add03d2a21f2 | # Dataset Card for "esc50_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Codec-SUPERB/esc50_synth | [
"region:us"
] | 2024-01-16T09:10:02+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "original", "path": "data/original-*"}, {"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k_12bps", "path": "data/encodec_24k_12bps-*"}, {"split": "encodec_24k_1_5bps", "path": "data/encodec_24k_1_5bps-*"}, {"split": "encodec_24k_24bps", "path": "data/encodec_24k_24bps-*"}, {"split": "encodec_24k_3bps", "path": "data/encodec_24k_3bps-*"}, {"split": "encodec_24k_6bps", "path": "data/encodec_24k_6bps-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 48000}}}, {"name": "id", "dtype": "string"}], "splits": [{"name": "original", "num_bytes": 960127258.0, "num_examples": 2000}, {"name": "academicodec_hifi_16k_320d", "num_bytes": 320129480.0, "num_examples": 2000}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 320129480.0, "num_examples": 2000}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 480129480.0, "num_examples": 2000}, {"name": "audiodec_24k_320d", "num_bytes": 480129480.0, "num_examples": 2000}, {"name": "dac_16k", "num_bytes": 320129480.0, "num_examples": 2000}, {"name": "dac_24k", "num_bytes": 480129480.0, "num_examples": 2000}, {"name": "dac_44k", "num_bytes": 882129480.0, "num_examples": 2000}, {"name": "encodec_24k_12bps", "num_bytes": 480129480.0, "num_examples": 2000}, {"name": "encodec_24k_1_5bps", "num_bytes": 480129480.0, "num_examples": 2000}, {"name": "encodec_24k_24bps", "num_bytes": 480129480.0, "num_examples": 2000}, {"name": "encodec_24k_3bps", "num_bytes": 480129480.0, "num_examples": 2000}, {"name": "encodec_24k_6bps", "num_bytes": 480129480.0, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 320129480.0, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 320129480.0, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 320129480.0, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 320129480.0, "num_examples": 2000}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 320129480.0, "num_examples": 2000}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 320129480.0, "num_examples": 2000}, {"name": "speech_tokenizer_16k", "num_bytes": 320129480.0, "num_examples": 2000}], "download_size": 7976139767, "dataset_size": 8884587378.0}} | 2024-01-28T02:37:57+00:00 | [] 
| [] | TAGS
#region-us
| # Dataset Card for "esc50_synth"
More Information needed | [
"# Dataset Card for \"esc50_synth\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"esc50_synth\"\n\nMore Information needed"
] |
ef3a18a10c3d954949a9aa50b6622373cc1e4c0f | # Dataset Card for "DUC2004"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mtc/DUC2004 | [
"region:us"
] | 2024-01-16T09:27:31+00:00 | {"dataset_info": {"features": [{"name": "document", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7368680, "num_examples": 200}], "download_size": 1033281, "dataset_size": 7368680}} | 2024-01-16T09:27:33+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "DUC2004"
More Information needed | [
"# Dataset Card for \"DUC2004\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"DUC2004\"\n\nMore Information needed"
] |
6fbd37b716deb66b097418c2bd317b859ea2bc94 |
# Dataset Card for Evaluation run of Technoculture/Medtulu-4x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/Medtulu-4x7B](https://huggingface.co/Technoculture/Medtulu-4x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__Medtulu-4x7B",
"harness_winogrande_5",
split="train")
```
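
The aggregated metrics mentioned above live in the "results" configuration; a minimal sketch of loading its latest split (config and split names as described in this card):

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Technoculture__Medtulu-4x7B",
    "results",
    split="latest",
)
print(results[0])
```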
## Latest results
These are the [latest results from run 2024-01-16T09:26:06.099420](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medtulu-4x7B/blob/main/results_2024-01-16T09-26-06.099420.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2441106685479756,
"acc_stderr": 0.030388013771384576,
"acc_norm": 0.24501971068706568,
"acc_norm_stderr": 0.031199333244496447,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080522,
"mc2": 0.47911756406040795,
"mc2_stderr": 0.016890966208763153
},
"harness|arc:challenge|25": {
"acc": 0.21928327645051193,
"acc_stderr": 0.012091245787615707,
"acc_norm": 0.28754266211604096,
"acc_norm_stderr": 0.01322671905626613
},
"harness|hellaswag|10": {
"acc": 0.2559251145190201,
"acc_stderr": 0.004354881005789727,
"acc_norm": 0.2574188408683529,
"acc_norm_stderr": 0.004363185172047182
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517905,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517905
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610645,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610645
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.02560423347089911,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.02560423347089911
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.038009680605548574,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.038009680605548574
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730424,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730424
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.037082846624165444,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.037082846624165444
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2297872340425532,
"acc_stderr": 0.027501752944412424,
"acc_norm": 0.2297872340425532,
"acc_norm_stderr": 0.027501752944412424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708624,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708624
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.21935483870967742,
"acc_stderr": 0.02354079935872331,
"acc_norm": 0.21935483870967742,
"acc_norm_stderr": 0.02354079935872331
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.029678333141444437,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.029678333141444437
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.02184086699042308,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.02184086699042308
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.02564947026588919,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.02564947026588919
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17880794701986755,
"acc_stderr": 0.031287448506007245,
"acc_norm": 0.17880794701986755,
"acc_norm_stderr": 0.031287448506007245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22752293577981653,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.22752293577981653,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03005820270430985,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03005820270430985
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20253164556962025,
"acc_stderr": 0.026160568246601457,
"acc_norm": 0.20253164556962025,
"acc_norm_stderr": 0.026160568246601457
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.22869955156950672,
"acc_stderr": 0.028188240046929196,
"acc_norm": 0.22869955156950672,
"acc_norm_stderr": 0.028188240046929196
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529615,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529615
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468645,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824768,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824768
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250885,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250885
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26010430247718386,
"acc_stderr": 0.011204382887823834,
"acc_norm": 0.26010430247718386,
"acc_norm_stderr": 0.011204382887823834
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16544117647058823,
"acc_stderr": 0.022571771025494767,
"acc_norm": 0.16544117647058823,
"acc_norm_stderr": 0.022571771025494767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.017917974069594726,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.017917974069594726
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724137,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724137
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1746987951807229,
"acc_stderr": 0.02956032621125685,
"acc_norm": 0.1746987951807229,
"acc_norm_stderr": 0.02956032621125685
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080522,
"mc2": 0.47911756406040795,
"mc2_stderr": 0.016890966208763153
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.014051956064076918
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
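
A small sketch of reading a couple of metrics from the results file above, assuming it is saved locally with the top-level layout printed here (the hosted file may wrap these fields differently):

```python
import json

# Assumes the JSON has the structure shown above: task names mapping to metric dicts.
with open("results_2024-01-16T09-26-06.099420.json") as f:
    scores = json.load(f)

print(scores["all"]["acc"])                      # mean accuracy across tasks
print(scores["harness|truthfulqa:mc|0"]["mc2"])  # TruthfulQA MC2 score
```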
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Technoculture__Medtulu-4x7B | [
"region:us"
] | 2024-01-16T09:28:28+00:00 | {"pretty_name": "Evaluation run of Technoculture/Medtulu-4x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Technoculture/Medtulu-4x7B](https://huggingface.co/Technoculture/Medtulu-4x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__Medtulu-4x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T09:26:06.099420](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medtulu-4x7B/blob/main/results_2024-01-16T09-26-06.099420.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2441106685479756,\n \"acc_stderr\": 0.030388013771384576,\n \"acc_norm\": 0.24501971068706568,\n \"acc_norm_stderr\": 0.031199333244496447,\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080522,\n \"mc2\": 0.47911756406040795,\n \"mc2_stderr\": 0.016890966208763153\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21928327645051193,\n \"acc_stderr\": 0.012091245787615707,\n \"acc_norm\": 0.28754266211604096,\n \"acc_norm_stderr\": 0.01322671905626613\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2559251145190201,\n \"acc_stderr\": 0.004354881005789727,\n \"acc_norm\": 0.2574188408683529,\n \"acc_norm_stderr\": 0.004363185172047182\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03820169914517905,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03820169914517905\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610645,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610645\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.02560423347089911,\n \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.02560423347089911\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 0.038009680605548574,\n \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.038009680605548574\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 
0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.031265112061730424,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.031265112061730424\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.037082846624165444,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.037082846624165444\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412424,\n \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708624,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708624\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.21935483870967742,\n \"acc_stderr\": 0.02354079935872331,\n \"acc_norm\": 0.21935483870967742,\n \"acc_norm_stderr\": 0.02354079935872331\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.029678333141444437,\n \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.029678333141444437\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.20707070707070707,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.02184086699042308,\n \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.02184086699042308\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.02564947026588919,\n \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.02564947026588919\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.17880794701986755,\n \"acc_stderr\": 0.031287448506007245,\n \"acc_norm\": 0.17880794701986755,\n \"acc_norm_stderr\": 0.031287448506007245\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22752293577981653,\n \"acc_stderr\": 0.017974463578776502,\n \"acc_norm\": 0.22752293577981653,\n \"acc_norm_stderr\": 0.017974463578776502\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03005820270430985,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03005820270430985\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601457,\n \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601457\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.22869955156950672,\n \"acc_stderr\": 0.028188240046929196,\n \"acc_norm\": 0.22869955156950672,\n \"acc_norm_stderr\": 0.028188240046929196\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.19444444444444445,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n \"acc_stderr\": 0.015517322365529615,\n \"acc_norm\": 0.2515964240102171,\n \"acc_norm_stderr\": 0.015517322365529615\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n \"acc_stderr\": 0.014530330201468645,\n \"acc_norm\": 0.25251396648044694,\n \"acc_norm_stderr\": 0.014530330201468645\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824768,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824768\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n \"acc_stderr\": 0.024826171289250885,\n \"acc_norm\": 0.2572347266881029,\n \"acc_norm_stderr\": 0.024826171289250885\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.024383665531035457,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.024383665531035457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26010430247718386,\n \"acc_stderr\": 0.011204382887823834,\n \"acc_norm\": 0.26010430247718386,\n \"acc_norm_stderr\": 0.011204382887823834\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.022571771025494767,\n \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.022571771025494767\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594726,\n \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594726\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1746987951807229,\n \"acc_stderr\": 0.02956032621125685,\n \"acc_norm\": 0.1746987951807229,\n \"acc_norm_stderr\": 0.02956032621125685\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080522,\n \"mc2\": 0.47911756406040795,\n \"mc2_stderr\": 0.016890966208763153\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.014051956064076918\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Technoculture/Medtulu-4x7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|arc:challenge|25_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|gsm8k|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hellaswag|10_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-26-06.099420.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-26-06.099420.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-26-06.099420.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T09-26-06.099420.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-26-06.099420.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T09_26_06.099420", "path": ["**/details_harness|winogrande|5_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T09-26-06.099420.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_16T09_26_06.099420", "path": ["results_2024-01-16T09-26-06.099420.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T09-26-06.099420.parquet"]}]}]} | 2024-01-16T09:28:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Technoculture/Medtulu-4x7B
Dataset automatically created during the evaluation run of model Technoculture/Medtulu-4x7B on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-16T09:26:06.099420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Technoculture/Medtulu-4x7B\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/Medtulu-4x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T09:26:06.099420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Technoculture/Medtulu-4x7B\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/Medtulu-4x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T09:26:06.099420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
eae7a6e33a4c8e79cc0860d87deb2cd487e1ac01 |
# Dataset Card for Evaluation run of Technoculture/Medorca-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/Medorca-4x7b](https://huggingface.co/Technoculture/Medorca-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__Medorca-4x7b",
"harness_winogrande_5",
split="train")
```
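
You can also list the available configurations, or point directly at the aggregated results. A minimal sketch is shown below; the config name `results` and the `latest` split are taken from the configuration listing of this dataset, so adjust them if later runs change the layout:

```python
from datasets import get_dataset_config_names, load_dataset

# List every available config: one per evaluated task, plus the aggregated "results".
configs = get_dataset_config_names("open-llm-leaderboard/details_Technoculture__Medorca-4x7b")
print(len(configs), "configs available")

# Load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Technoculture__Medorca-4x7b",
    "results",
    split="latest",
)
print(results[0])
```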
## Latest results
These are the [latest results from run 2024-01-16T09:31:13.388605](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medorca-4x7b/blob/main/results_2024-01-16T09-31-13.388605.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24277591604603688,
"acc_stderr": 0.030362031381414405,
"acc_norm": 0.2439454714476596,
"acc_norm_stderr": 0.031170707455863384,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871093,
"mc2": 0.484210555153098,
"mc2_stderr": 0.016831351518748705
},
"harness|arc:challenge|25": {
"acc": 0.2295221843003413,
"acc_stderr": 0.012288926760890776,
"acc_norm": 0.2935153583617747,
"acc_norm_stderr": 0.013307250444941117
},
"harness|hellaswag|10": {
"acc": 0.25473013343955386,
"acc_stderr": 0.0043481894593365355,
"acc_norm": 0.25721967735510853,
"acc_norm_stderr": 0.004362081806560237
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003337,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003337
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102148,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102148
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27419354838709675,
"acc_stderr": 0.025378139970885196,
"acc_norm": 0.27419354838709675,
"acc_norm_stderr": 0.025378139970885196
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617732,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617732
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945637,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945637
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.02869787397186067,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.02869787397186067
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.021107730127243998,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.021107730127243998
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.02513045365226846,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.02513045365226846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690875,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690875
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529624,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529624
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574882,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574882
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.01766784161237899,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.01766784161237899
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.02671143055553842,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.02671143055553842
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.19883040935672514,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.19883040935672514,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871093,
"mc2": 0.484210555153098,
"mc2_stderr": 0.016831351518748705
},
"harness|winogrande|5": {
"acc": 0.48303078137332284,
"acc_stderr": 0.014044390401612976
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
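
If you want to aggregate these numbers yourself (for example, averaging accuracy over the MMLU sub-tasks), a minimal sketch follows. The small inline JSON is a hypothetical stand-in for the full blob above, and the key layout (`harness|hendrycksTest-*` entries with an `acc` field) is assumed to match what is printed in this card:

```python
import json

# Hypothetical stand-in for the results blob above (only two MMLU entries shown);
# in practice you would parse the full JSON file linked at the top of this section.
results_json = """{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.21},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.24444444444444444}
}"""
results = json.loads(results_json)

# Average accuracy over all MMLU (hendrycksTest) sub-tasks present in the dict.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
print(sum(v["acc"] for v in mmlu.values()) / len(mmlu))
```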
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Technoculture__Medorca-4x7b | [
"region:us"
] | 2024-01-16T09:33:32+00:00 | {"pretty_name": "Evaluation run of Technoculture/Medorca-4x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Technoculture/Medorca-4x7b](https://huggingface.co/Technoculture/Medorca-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__Medorca-4x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T09:31:13.388605](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medorca-4x7b/blob/main/results_2024-01-16T09-31-13.388605.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24277591604603688,\n \"acc_stderr\": 0.030362031381414405,\n \"acc_norm\": 0.2439454714476596,\n \"acc_norm_stderr\": 0.031170707455863384,\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871093,\n \"mc2\": 0.484210555153098,\n \"mc2_stderr\": 0.016831351518748705\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2295221843003413,\n \"acc_stderr\": 0.012288926760890776,\n \"acc_norm\": 0.2935153583617747,\n \"acc_norm_stderr\": 0.013307250444941117\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25473013343955386,\n \"acc_stderr\": 0.0043481894593365355,\n \"acc_norm\": 0.25721967735510853,\n \"acc_norm_stderr\": 0.004362081806560237\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 
0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003337,\n \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003337\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n \"acc_stderr\": 0.03455071019102148,\n \"acc_norm\": 0.18253968253968253,\n \"acc_norm_stderr\": 0.03455071019102148\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27419354838709675,\n \"acc_stderr\": 0.025378139970885196,\n \"acc_norm\": 0.27419354838709675,\n \"acc_norm_stderr\": 0.025378139970885196\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617732,\n \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617732\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.02869787397186067,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.02869787397186067\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.021107730127243998,\n \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.021107730127243998\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.16203703703703703,\n \"acc_stderr\": 0.02513045365226846,\n \"acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.02513045365226846\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.2515964240102171,\n \"acc_stderr\": 0.015517322365529624,\n \"acc_norm\": 0.2515964240102171,\n \"acc_norm_stderr\": 0.015517322365529624\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574882,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574882\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.02671143055553842,\n \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.02671143055553842\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.19883040935672514,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.19883040935672514,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871093,\n \"mc2\": 0.484210555153098,\n \"mc2_stderr\": 0.016831351518748705\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.48303078137332284,\n \"acc_stderr\": 0.014044390401612976\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", 
"repo_url": "https://huggingface.co/Technoculture/Medorca-4x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|arc:challenge|25_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|gsm8k|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hellaswag|10_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-31-13.388605.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-31-13.388605.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-31-13.388605.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T09-31-13.388605.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-31-13.388605.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T09_31_13.388605", "path": ["**/details_harness|winogrande|5_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T09-31-13.388605.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_16T09_31_13.388605", "path": ["results_2024-01-16T09-31-13.388605.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T09-31-13.388605.parquet"]}]}]} | 2024-01-16T09:33:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Technoculture/Medorca-4x7b
Dataset automatically created during the evaluation run of model Technoculture/Medorca-4x7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
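A minimal loading sketch, mirroring the pattern used by the other evaluation-run datasets in this collection. The repository name below is inferred from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention (an assumption, not stated on this card); `harness_winogrande_5` is one of the 63 task configurations listed in the repo metadata.

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard naming convention -- verify against the actual repo.
data = load_dataset(
    "open-llm-leaderboard/details_Technoculture__Medorca-4x7b",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",           # "train" always points to the latest results
)
print(data)
```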
## Latest results
These are the latest results from run 2024-01-16T09:31:13.388605 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Technoculture/Medorca-4x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/Medorca-4x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T09:31:13.388605(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Technoculture/Medorca-4x7b\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/Medorca-4x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T09:31:13.388605(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
da3e9e347cbf37c4eba988128fbfd0f34ed409bb |
# PCBM Metashift
For the sake of reproducibility, this dataset hosts the postprocessed Metashift according to [[Yuksekgonul et al.]](https://arxiv.org/pdf/2205.15480.pdf) for use in Post-Hoc Concept Bottleneck Models.
| Config Name | Description |
|---|---|
| `task_1_bed_cat_dog` | Task 1: bed(cat) -> bed(dog) |
| `task_1_bed_dog_cat` | Task 1: bed(dog) -> bed(cat) |
| `task_2_table_books_cat` | Task 2: table(books) -> table(cat) |
| `task_2_table_books_dog` | Task 2: table(books) -> table(dog) |
| `task_2_table_cat_dog` | Task 2: table(cat) -> table(dog) |
| `task_2_table_dog_cat` | Task 2: table(dog) -> table(cat) |
The script to generate this dataset can be found at `scripts/generate.py`. You will need to download the [Metashift repo](https://github.com/Weixin-Liang/MetaShift) and the [Visual Genome dataset](https://nlp.stanford.edu/data/gqa/images.zip) as instructed in the Metashift repo.
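A minimal usage sketch. The repo id and the `cherrypicked_*` / `seed42_*` config names are taken from this repository's metadata; whether the postprocessed images load directly with `datasets` (rather than requiring the generation step above) is an assumption to verify.

```python
from datasets import load_dataset

# Config name from the repo metadata ("seed42_" / "cherrypicked_" variants of the
# tasks in the table above); adjust to the variant you need.
ds = load_dataset("anonymous347928/pcbm_metashift", "seed42_task_1_bed_cat_dog")

print(ds)                                    # train/test splits of 500 examples each
print(ds["train"].features["label"].names)   # e.g. airplane, bed, car, cow, keyboard
sample = ds["train"][0]
sample["image"]                              # image drawn from the Visual Genome source
```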
| anonymous347928/pcbm_metashift | [
"task_categories:image-classification",
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"arxiv:2205.15480",
"region:us"
] | 2024-01-16T09:44:29+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["image-classification"], "pretty_name": "Metashift subset for PCBM reproduction", "viewer": false, "dataset_info": [{"config_name": "cherrypicked_task_1_bed_cat_dog", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "airplane", "1": "bed", "2": "car", "3": "cow", "4": "keyboard"}}}}], "splits": [{"name": "train", "num_bytes": 28494, "num_examples": 500}, {"name": "test", "num_bytes": 28486, "num_examples": 500}], "download_size": 477673284, "dataset_size": 56980}, {"config_name": "cherrypicked_task_1_bed_dog_cat", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "airplane", "1": "bed", "2": "car", "3": "cow", "4": "keyboard"}}}}], "splits": [{"name": "train", "num_bytes": 28490, "num_examples": 500}, {"name": "test", "num_bytes": 28478, "num_examples": 500}], "download_size": 477673272, "dataset_size": 56968}, {"config_name": "cherrypicked_task_2_table_books_cat", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "beach", "1": "computer", "2": "motorcycle", "3": "stove", "4": "table"}}}}], "splits": [{"name": "train", "num_bytes": 28413, "num_examples": 500}, {"name": "test", "num_bytes": 28478, "num_examples": 500}], "download_size": 477673223, "dataset_size": 56891}, {"config_name": "cherrypicked_task_2_table_books_dog", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "beach", "1": "computer", "2": "motorcycle", "3": "stove", "4": "table"}}}}], "splits": [{"name": "train", "num_bytes": 28411, "num_examples": 500}, {"name": "test", "num_bytes": 28477, "num_examples": 500}], "download_size": 477673220, "dataset_size": 56888}, {"config_name": "cherrypicked_task_2_table_cat_dog", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "beach", "1": "computer", "2": "motorcycle", "3": "stove", "4": "table"}}}}], "splits": [{"name": "train", "num_bytes": 28477, "num_examples": 500}, {"name": "test", "num_bytes": 28485, "num_examples": 500}], "download_size": 477673292, "dataset_size": 56962}, {"config_name": "cherrypicked_task_2_table_dog_cat", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "beach", "1": "computer", "2": "motorcycle", "3": "stove", "4": "table"}}}}], "splits": [{"name": "train", "num_bytes": 28476, "num_examples": 500}, {"name": "test", "num_bytes": 28484, "num_examples": 500}], "download_size": 477673290, "dataset_size": 56960}, {"config_name": "seed42_task_1_bed_cat_dog", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "airplane", "1": "bed", "2": "car", "3": "cow", "4": "keyboard"}}}}], "splits": [{"name": "train", "num_bytes": 28498, "num_examples": 500}, {"name": "test", "num_bytes": 28480, "num_examples": 500}], "download_size": 477673282, "dataset_size": 56978}, {"config_name": "seed42_task_1_bed_dog_cat", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "airplane", "1": "bed", "2": "car", "3": "cow", "4": "keyboard"}}}}], "splits": [{"name": "train", "num_bytes": 28501, "num_examples": 500}, {"name": "test", "num_bytes": 28485, "num_examples": 500}], "download_size": 477673290, 
"dataset_size": 56986}, {"config_name": "seed42_task_2_table_books_cat", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "beach", "1": "computer", "2": "motorcycle", "3": "stove", "4": "table"}}}}], "splits": [{"name": "train", "num_bytes": 28434, "num_examples": 500}, {"name": "test", "num_bytes": 28481, "num_examples": 500}], "download_size": 477673247, "dataset_size": 56915}, {"config_name": "seed42_task_2_table_books_dog", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "beach", "1": "computer", "2": "motorcycle", "3": "stove", "4": "table"}}}}], "splits": [{"name": "train", "num_bytes": 28434, "num_examples": 500}, {"name": "test", "num_bytes": 28479, "num_examples": 500}], "download_size": 477673245, "dataset_size": 56913}, {"config_name": "seed42_task_2_table_cat_dog", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "beach", "1": "computer", "2": "motorcycle", "3": "stove", "4": "table"}}}}], "splits": [{"name": "train", "num_bytes": 28465, "num_examples": 500}, {"name": "test", "num_bytes": 28479, "num_examples": 500}], "download_size": 477673274, "dataset_size": 56944}, {"config_name": "seed42_task_2_table_dog_cat", "features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "beach", "1": "computer", "2": "motorcycle", "3": "stove", "4": "table"}}}}], "splits": [{"name": "train", "num_bytes": 28463, "num_examples": 500}, {"name": "test", "num_bytes": 28481, "num_examples": 500}], "download_size": 477673274, "dataset_size": 56944}]} | 2024-02-15T15:03:34+00:00 | [
"2205.15480"
] | [
"en"
] | TAGS
#task_categories-image-classification #size_categories-1K<n<10K #language-English #license-mit #arxiv-2205.15480 #region-us
| PCBM Metashift
==============
For the sake of reproducibility, this dataset hosts the postprocessed Metashift according to [[Yuksekgonul et al.]](URL) for use in Post-Hoc Concept Bottleneck Models.
The script to generate this dataset can be found at 'scripts/URL'. You will need to download the Metashift repo and the Visual Genome dataset as instructed in the Metashift repo.
| [] | [
"TAGS\n#task_categories-image-classification #size_categories-1K<n<10K #language-English #license-mit #arxiv-2205.15480 #region-us \n"
] |
39d8e0c0ed9b988ac991916e227273ffeae748b2 | # Dataset Card for "myridade_dbg_aligned_ontologie_filter_myriade"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/myridade_dbg_aligned_ontologie_filter_myriade | [
"region:us"
] | 2024-01-16T10:00:38+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 47868666, "num_examples": 98206}], "download_size": 11206988, "dataset_size": 47868666}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T10:00:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "myridade_dbg_aligned_ontologie_filter_myriade"
More Information needed | [
"# Dataset Card for \"myridade_dbg_aligned_ontologie_filter_myriade\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"myridade_dbg_aligned_ontologie_filter_myriade\"\n\nMore Information needed"
] |
7c6d270271c45d732087ab1cdd0566ef51e89131 |
Dataset created for accelerated processing. Embeddings from this fine model:
- Locutusque/TinyMistral-248M-Instruct | jtatman/tinymistral-hypnosis-instruct-preprocessed | [
"task_categories:question-answering",
"task_categories:text-generation",
"task_categories:conversational",
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"health",
"medical",
"therapy",
"hypnosis",
"region:us"
] | 2024-01-16T10:18:55+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["question-answering", "text-generation", "conversational"], "pretty_name": "hypnosis instruct ", "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 1031463407.6346276, "num_examples": 2832454}, {"name": "eval", "num_bytes": 9103.973159269555, "num_examples": 25}], "download_size": 307894933, "dataset_size": 1031472511.6077869}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "eval", "path": "data/eval-*"}]}], "tags": ["health", "medical", "therapy", "hypnosis"]} | 2024-01-16T17:59:33+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #task_categories-conversational #size_categories-1M<n<10M #language-English #license-apache-2.0 #health #medical #therapy #hypnosis #region-us
|
Dataset created for accelerated processing. Embeddings from this fine model:
- Locutusque/TinyMistral-248M-Instruct | [] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #task_categories-conversational #size_categories-1M<n<10M #language-English #license-apache-2.0 #health #medical #therapy #hypnosis #region-us \n"
] |
bc153e799a128a855ad1a311e7cf61cb5ee9aba3 |
# Dataset Card for Evaluation run of qnguyen3/quan-1.8b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [qnguyen3/quan-1.8b-chat](https://huggingface.co/qnguyen3/quan-1.8b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_qnguyen3__quan-1.8b-chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T11:13:19.271174](https://huggingface.co/datasets/open-llm-leaderboard/details_qnguyen3__quan-1.8b-chat/blob/main/results_2024-01-16T11-13-19.271174.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4400975999737303,
"acc_stderr": 0.0345967614345703,
"acc_norm": 0.4431186866947614,
"acc_norm_stderr": 0.03532660922667111,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.4314996062576424,
"mc2_stderr": 0.015306262833109105
},
"harness|arc:challenge|25": {
"acc": 0.3728668941979522,
"acc_stderr": 0.014131176760131165,
"acc_norm": 0.39078498293515357,
"acc_norm_stderr": 0.014258563880513778
},
"harness|hellaswag|10": {
"acc": 0.47560246962756425,
"acc_stderr": 0.004983837641502896,
"acc_norm": 0.6236805417247561,
"acc_norm_stderr": 0.00483471581420811
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5056603773584906,
"acc_stderr": 0.030770900763851316,
"acc_norm": 0.5056603773584906,
"acc_norm_stderr": 0.030770900763851316
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523867,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523867
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6113989637305699,
"acc_stderr": 0.035177397963731316,
"acc_norm": 0.6113989637305699,
"acc_norm_stderr": 0.035177397963731316
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.40512820512820513,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.40512820512820513,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5467889908256881,
"acc_stderr": 0.021343255165546037,
"acc_norm": 0.5467889908256881,
"acc_norm_stderr": 0.021343255165546037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.03508637358630572,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.03508637358630572
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5316455696202531,
"acc_stderr": 0.032481974005110756,
"acc_norm": 0.5316455696202531,
"acc_norm_stderr": 0.032481974005110756
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4732824427480916,
"acc_stderr": 0.04379024936553893,
"acc_norm": 0.4732824427480916,
"acc_norm_stderr": 0.04379024936553893
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4723926380368098,
"acc_stderr": 0.039223782906109894,
"acc_norm": 0.4723926380368098,
"acc_norm_stderr": 0.039223782906109894
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6581196581196581,
"acc_stderr": 0.03107502852650775,
"acc_norm": 0.6581196581196581,
"acc_norm_stderr": 0.03107502852650775
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5644955300127714,
"acc_stderr": 0.017730589927926588,
"acc_norm": 0.5644955300127714,
"acc_norm_stderr": 0.017730589927926588
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.026911898686377927,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.026911898686377927
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.028555827516528787,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.028555827516528787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.45016077170418006,
"acc_stderr": 0.028256660723360184,
"acc_norm": 0.45016077170418006,
"acc_norm_stderr": 0.028256660723360184
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.027744313443376536,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.027744313443376536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34485006518904826,
"acc_stderr": 0.012139881006287058,
"acc_norm": 0.34485006518904826,
"acc_norm_stderr": 0.012139881006287058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3786764705882353,
"acc_stderr": 0.029465133639776132,
"acc_norm": 0.3786764705882353,
"acc_norm_stderr": 0.029465133639776132
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4084967320261438,
"acc_stderr": 0.01988622103750188,
"acc_norm": 0.4084967320261438,
"acc_norm_stderr": 0.01988622103750188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3781094527363184,
"acc_stderr": 0.03428867848778658,
"acc_norm": 0.3781094527363184,
"acc_norm_stderr": 0.03428867848778658
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.037777988227480165,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.037777988227480165
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5146198830409356,
"acc_stderr": 0.038331852752130254,
"acc_norm": 0.5146198830409356,
"acc_norm_stderr": 0.038331852752130254
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.4314996062576424,
"mc2_stderr": 0.015306262833109105
},
"harness|winogrande|5": {
"acc": 0.5927387529597474,
"acc_stderr": 0.013808654122417848
},
"harness|gsm8k|5": {
"acc": 0.27520849128127367,
"acc_stderr": 0.012302114305862647
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_qnguyen3__quan-1.8b-chat | [
"region:us"
] | 2024-01-16T11:15:31+00:00 | {"pretty_name": "Evaluation run of qnguyen3/quan-1.8b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [qnguyen3/quan-1.8b-chat](https://huggingface.co/qnguyen3/quan-1.8b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_qnguyen3__quan-1.8b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T11:13:19.271174](https://huggingface.co/datasets/open-llm-leaderboard/details_qnguyen3__quan-1.8b-chat/blob/main/results_2024-01-16T11-13-19.271174.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4400975999737303,\n \"acc_stderr\": 0.0345967614345703,\n \"acc_norm\": 0.4431186866947614,\n \"acc_norm_stderr\": 0.03532660922667111,\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.4314996062576424,\n \"mc2_stderr\": 0.015306262833109105\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3728668941979522,\n \"acc_stderr\": 0.014131176760131165,\n \"acc_norm\": 0.39078498293515357,\n \"acc_norm_stderr\": 0.014258563880513778\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47560246962756425,\n \"acc_stderr\": 0.004983837641502896,\n \"acc_norm\": 0.6236805417247561,\n \"acc_norm_stderr\": 0.00483471581420811\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249033,\n \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249033\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851316,\n \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851316\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n 
\"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523867,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523867\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6113989637305699,\n \"acc_stderr\": 0.035177397963731316,\n \"acc_norm\": 0.6113989637305699,\n \"acc_norm_stderr\": 0.035177397963731316\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.40512820512820513,\n \"acc_stderr\": 
0.024890471769938145,\n \"acc_norm\": 0.40512820512820513,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5467889908256881,\n \"acc_stderr\": 0.021343255165546037,\n \"acc_norm\": 0.5467889908256881,\n \"acc_norm_stderr\": 0.021343255165546037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.03508637358630572,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.03508637358630572\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5316455696202531,\n \"acc_stderr\": 0.032481974005110756,\n \"acc_norm\": 0.5316455696202531,\n \"acc_norm_stderr\": 0.032481974005110756\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4732824427480916,\n \"acc_stderr\": 0.04379024936553893,\n \"acc_norm\": 0.4732824427480916,\n \"acc_norm_stderr\": 0.04379024936553893\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4723926380368098,\n \"acc_stderr\": 0.039223782906109894,\n \"acc_norm\": 0.4723926380368098,\n \"acc_norm_stderr\": 0.039223782906109894\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6581196581196581,\n \"acc_stderr\": 0.03107502852650775,\n \"acc_norm\": 0.6581196581196581,\n \"acc_norm_stderr\": 0.03107502852650775\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5644955300127714,\n \"acc_stderr\": 0.017730589927926588,\n \"acc_norm\": 0.5644955300127714,\n 
\"acc_norm_stderr\": 0.017730589927926588\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.026911898686377927,\n \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.026911898686377927\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.028555827516528787,\n \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.028555827516528787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.45016077170418006,\n \"acc_stderr\": 0.028256660723360184,\n \"acc_norm\": 0.45016077170418006,\n \"acc_norm_stderr\": 0.028256660723360184\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.027744313443376536,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.027744313443376536\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34485006518904826,\n \"acc_stderr\": 0.012139881006287058,\n \"acc_norm\": 0.34485006518904826,\n \"acc_norm_stderr\": 0.012139881006287058\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3786764705882353,\n \"acc_stderr\": 0.029465133639776132,\n \"acc_norm\": 0.3786764705882353,\n \"acc_norm_stderr\": 0.029465133639776132\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4084967320261438,\n \"acc_stderr\": 0.01988622103750188,\n \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.01988622103750188\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3781094527363184,\n \"acc_stderr\": 0.03428867848778658,\n \"acc_norm\": 0.3781094527363184,\n \"acc_norm_stderr\": 0.03428867848778658\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n \"acc_stderr\": 0.037777988227480165,\n \"acc_norm\": 0.3795180722891566,\n \"acc_norm_stderr\": 0.037777988227480165\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5146198830409356,\n \"acc_stderr\": 0.038331852752130254,\n \"acc_norm\": 0.5146198830409356,\n \"acc_norm_stderr\": 0.038331852752130254\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.4314996062576424,\n \"mc2_stderr\": 0.015306262833109105\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5927387529597474,\n \"acc_stderr\": 0.013808654122417848\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27520849128127367,\n \"acc_stderr\": 0.012302114305862647\n }\n}\n```", "repo_url": "https://huggingface.co/qnguyen3/quan-1.8b-chat", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|arc:challenge|25_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|gsm8k|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hellaswag|10_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-13-19.271174.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-13-19.271174.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-13-19.271174.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T11-13-19.271174.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-13-19.271174.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-13-19.271174.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["**/details_harness|winogrande|5_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T11-13-19.271174.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T11_13_19.271174", "path": ["results_2024-01-16T11-13-19.271174.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T11-13-19.271174.parquet"]}]}]} | 2024-01-16T11:15:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of qnguyen3/quan-1.8b-chat
Dataset automatically created during the evaluation run of model qnguyen3/quan-1.8b-chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-16T11:13:19.271174 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of qnguyen3/quan-1.8b-chat\n\n\n\nDataset automatically created during the evaluation run of model qnguyen3/quan-1.8b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T11:13:19.271174(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of qnguyen3/quan-1.8b-chat\n\n\n\nDataset automatically created during the evaluation run of model qnguyen3/quan-1.8b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T11:13:19.271174(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
aeda5886e1c24aedf336f45dc0846ef3719a039c |
# Dataset Card for Evaluation run of kevin009/Llamafia
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kevin009/Llamafia](https://huggingface.co/kevin009/Llamafia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__Llamafia",
"harness_winogrande_5",
split="train")
```
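
To see which configurations are available before picking one, you can list them first. This is a minimal sketch assuming the `datasets` library is installed and you have network access to the Hugging Face Hub; the example output in the comment is illustrative:

```python
from datasets import get_dataset_config_names

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_kevin009__Llamafia")
print(configs[:5])  # e.g. ['harness_arc_challenge_25', 'harness_gsm8k_5', ...]
```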
## Latest results
These are the [latest results from run 2024-01-16T11:18:55.714824](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__Llamafia/blob/main/results_2024-01-16T11-18-55.714824.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6211061177095942,
"acc_stderr": 0.03253280765271219,
"acc_norm": 0.6223051208563235,
"acc_norm_stderr": 0.03319508878633904,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.47938272596576464,
"mc2_stderr": 0.01507589659584474
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759082,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.617307309300936,
"acc_stderr": 0.004850508945116088,
"acc_norm": 0.8207528380800637,
"acc_norm_stderr": 0.0038277525727700257
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.03550683989165581,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.03550683989165581
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.04598188057816542,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.04598188057816542
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601688,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110932,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612903,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545843,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545843
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593517,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593517
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2201117318435754,
"acc_stderr": 0.013856994024227175,
"acc_norm": 0.2201117318435754,
"acc_norm_stderr": 0.013856994024227175
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904664,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904664
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291477,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.01273239828619044,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.01273239828619044
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215927,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215927
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.47938272596576464,
"mc2_stderr": 0.01507589659584474
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515316
},
"harness|gsm8k|5": {
"acc": 0.6087945413191812,
"acc_stderr": 0.013442502402794302
}
}
```
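
The same aggregated numbers are also stored in the "results" configuration declared in this repo's metadata. Below is a minimal sketch for loading them with the `datasets` library (the exact column layout of the results parquet is not documented here, so inspect the returned row before relying on specific field names):

```python
from datasets import load_dataset

# Load the aggregated results at the "latest" split (the most recent run).
results = load_dataset("open-llm-leaderboard/details_kevin009__Llamafia",
                       "results",
                       split="latest")
print(results[0])  # a single row holding the aggregated metrics shown above
```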
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kevin009__Llamafia | [
"region:us"
] | 2024-01-16T11:21:13+00:00 | {"pretty_name": "Evaluation run of kevin009/Llamafia", "dataset_summary": "Dataset automatically created during the evaluation run of model [kevin009/Llamafia](https://huggingface.co/kevin009/Llamafia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevin009__Llamafia\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T11:18:55.714824](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__Llamafia/blob/main/results_2024-01-16T11-18-55.714824.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6211061177095942,\n \"acc_stderr\": 0.03253280765271219,\n \"acc_norm\": 0.6223051208563235,\n \"acc_norm_stderr\": 0.03319508878633904,\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.47938272596576464,\n \"mc2_stderr\": 0.01507589659584474\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759082,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.617307309300936,\n \"acc_stderr\": 0.004850508945116088,\n \"acc_norm\": 0.8207528380800637,\n \"acc_norm_stderr\": 0.0038277525727700257\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.03550683989165581,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.03550683989165581\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.04598188057816542,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.04598188057816542\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601688,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601688\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 
0.024503472557110932,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110932\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612903,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612903\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545843,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545843\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n \"acc_stderr\": 0.014648172749593517,\n \"acc_norm\": 0.7867177522349936,\n 
\"acc_norm_stderr\": 0.014648172749593517\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2201117318435754,\n \"acc_stderr\": 0.013856994024227175,\n \"acc_norm\": 0.2201117318435754,\n \"acc_norm_stderr\": 0.013856994024227175\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904664,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904664\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291477,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291477\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215927,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215927\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.47938272596576464,\n \"mc2_stderr\": 0.01507589659584474\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515316\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6087945413191812,\n \"acc_stderr\": 0.013442502402794302\n }\n}\n```", "repo_url": "https://huggingface.co/kevin009/Llamafia", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|arc:challenge|25_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|gsm8k|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hellaswag|10_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-18-55.714824.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-18-55.714824.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-18-55.714824.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T11-18-55.714824.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-18-55.714824.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-18-55.714824.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["**/details_harness|winogrande|5_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T11-18-55.714824.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T11_18_55.714824", "path": ["results_2024-01-16T11-18-55.714824.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T11-18-55.714824.parquet"]}]}]} | 2024-01-16T11:21:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kevin009/Llamafia
Dataset automatically created during the evaluation run of model kevin009/Llamafia on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
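```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__Llamafia",
	"harness_winogrande_5",
	split="train")
```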
## Latest results
These are the latest results from run 2024-01-16T11:18:55.714824 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kevin009/Llamafia\n\n\n\nDataset automatically created during the evaluation run of model kevin009/Llamafia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T11:18:55.714824(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kevin009/Llamafia\n\n\n\nDataset automatically created during the evaluation run of model kevin009/Llamafia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T11:18:55.714824(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
eec9427f05e51fdca66e0e59fdae62fdeec15307 |
# LunarLander-v2 - Imitation Learning Datasets
This is a dataset created by the [Imitation Learning Datasets](https://github.com/NathanGavenski/IL-Datasets) project.
It was created using the Stable Baselines weights of a PPO policy published on [HuggingFace](https://huggingface.co/sb3/ppo-LunarLander-v2).
## Description
The dataset consists of 1,000 episodes with an average episodic reward of 500.
Each entry consists of:
```
obs (list): observation vector of length 8.
action (int): action taken (0, 1, 2, or 3).
reward (float): reward received at that timestep.
episode_returns (bool): whether that state is the first timestep of an episode.
```
## Usage
Feel free to download and use the `teacher.jsonl` dataset as you please.
If you are interested in using our PyTorch Dataset implementation, feel free to check the [IL Datasets](https://github.com/NathanGavenski/IL-Datasets/blob/main/src/imitation_datasets/dataset/dataset.py) project.
There, we implement a base Dataset that downloads this dataset and all other datasets directly from HuggingFace.
The Baseline Dataset also allows for more control over train and test splits and how many episodes you want to use (in cases where the 1k episodes are not necessary).
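If you only need to inspect the raw episodes, the file can also be parsed directly. The sketch below is a minimal example (it is not part of the IL-Datasets API); it assumes `teacher.jsonl` has been downloaded locally and that each line holds one JSON object with the fields described above:

```python
import json

# Group consecutive timesteps into episodes using the episode_returns flag,
# which marks the first timestep of each new episode.
episodes, current = [], []
with open("teacher.jsonl") as f:
    for line in f:
        entry = json.loads(line)
        if entry["episode_returns"] and current:
            episodes.append(current)
            current = []
        current.append((entry["obs"], entry["action"], entry["reward"]))
if current:
    episodes.append(current)

print(f"parsed {len(episodes)} episodes")
```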
## Citation
Coming soon. | NathanGavenski/LunarLander-v2 | [
"size_categories:10M<n<100M",
"license:mit",
"Imitation Learning",
"Expert Trajectory",
"region:us"
] | 2024-01-16T11:53:29+00:00 | {"license": "mit", "size_categories": ["10M<n<100M"], "pretty_name": "LunarLander-v2 Expert Dataset", "tags": ["Imitation Learning", "Expert Trajectory"]} | 2024-01-16T11:56:25+00:00 | [] | [] | TAGS
#size_categories-10M<n<100M #license-mit #Imitation Learning #Expert Trajectory #region-us
|
# LunarLander-v2 - Imitation Learning Datasets
This is a dataset created by the Imitation Learning Datasets project.
It was created using the Stable Baselines weights of a PPO policy published on HuggingFace.
## Description
The dataset consists of 1,000 episodes with an average episodic reward of 500.
Each entry consists of:
## Usage
Feel free to download and use the 'URL' dataset as you please.
If you are interested in using our PyTorch Dataset implementation, feel free to check the IL Datasets project.
There, we implement a base Dataset that downloads this dataset and all other datasets directly from HuggingFace.
The Baseline Dataset also allows for more control over train and test splits and how many episodes you want to use (in cases where the 1k episodes are not necessary).
Coming soon. | [
"# LunarLander-v2 - Imitation Learning Datasets\n\nThis is a dataset created by Imitation Learning Datasets project. \nIt was created by using Stable Baselines weights from a PPO policy from HuggingFace.",
"## Description\n\nThe dataset consists of 1,000 episodes with an average episodic reward of 500.\nEach entry consists of:",
"## Usage\n\nFeel free to download and use the 'URL' dataset as you please.\nIf you are interested in using our PyTorch Dataset implementation, feel free to check the IL Datasets project.\nThere, we implement a base Dataset that downloads this dataset and all other datasets directly from HuggingFace.\nThe Baseline Dataset also allows for more control over train and test splits and how many episodes you want to use (in cases where the 1k episodes are not necessary).\n\nComing soon."
] | [
"TAGS\n#size_categories-10M<n<100M #license-mit #Imitation Learning #Expert Trajectory #region-us \n",
"# LunarLander-v2 - Imitation Learning Datasets\n\nThis is a dataset created by Imitation Learning Datasets project. \nIt was created by using Stable Baselines weights from a PPO policy from HuggingFace.",
"## Description\n\nThe dataset consists of 1,000 episodes with an average episodic reward of 500.\nEach entry consists of:",
"## Usage\n\nFeel free to download and use the 'URL' dataset as you please.\nIf you are interested in using our PyTorch Dataset implementation, feel free to check the IL Datasets project.\nThere, we implement a base Dataset that downloads this dataset and all other datasets directly from HuggingFace.\nThe Baseline Dataset also allows for more control over train and test splits and how many episodes you want to use (in cases where the 1k episodes are not necessary).\n\nComing soon."
] |
76db7eca0d2eea29a50138cd52c51df545f5f372 |
# Dataset Card for Evaluation run of ycros/BagelMIsteryTour-8x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ycros/BagelMIsteryTour-8x7B](https://huggingface.co/ycros/BagelMIsteryTour-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B",
"harness_winogrande_5",
split="train")
```
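
The card also mentions an aggregated `results` configuration. A minimal sketch for loading it could look like the following; the config name `results` and the `latest` split are taken from this card and its repo metadata, so treat the exact split name as an assumption if the repo layout changes:

```python
from datasets import load_dataset

# Aggregated metrics for the run; "results" and "latest" are named in this card.
results_ds = load_dataset(
    "open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B",
    "results",
    split="latest",
)
print(results_ds[0])  # inspect the aggregated scores for the most recent evaluation
```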
## Latest results
These are the [latest results from run 2024-01-16T11:57:17.024146](https://huggingface.co/datasets/open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B/blob/main/results_2024-01-16T11-57-17.024146.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7118537836260681,
"acc_stderr": 0.030292814267622557,
"acc_norm": 0.7154538145620108,
"acc_norm_stderr": 0.030880099721365066,
"mc1": 0.602203182374541,
"mc1_stderr": 0.017133934248559676,
"mc2": 0.7494844581449875,
"mc2_stderr": 0.014345730353310387
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.013385021637313572,
"acc_norm": 0.7244027303754266,
"acc_norm_stderr": 0.01305716965576184
},
"harness|hellaswag|10": {
"acc": 0.6919936267675761,
"acc_stderr": 0.004607256752931882,
"acc_norm": 0.8750248954391555,
"acc_norm_stderr": 0.0033001484456091326
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775406,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.034961014811911786,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.034961014811911786
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6403508771929824,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.6403508771929824,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.02573833063941215,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.02573833063941215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8580645161290322,
"acc_stderr": 0.019853003676559747,
"acc_norm": 0.8580645161290322,
"acc_norm_stderr": 0.019853003676559747
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6108374384236454,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.6108374384236454,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983116,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983116
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.015216761819262577,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.015216761819262577
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7153846153846154,
"acc_stderr": 0.0228783227997063,
"acc_norm": 0.7153846153846154,
"acc_norm_stderr": 0.0228783227997063
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8067226890756303,
"acc_stderr": 0.025649470265889183,
"acc_norm": 0.8067226890756303,
"acc_norm_stderr": 0.025649470265889183
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611753,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611753
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.02340553048084631,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.02340553048084631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.0334327006286962,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.0334327006286962
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911899,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911899
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786746,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786746
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.012086705214250428,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.012086705214250428
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4581005586592179,
"acc_stderr": 0.016663683295020527,
"acc_norm": 0.4581005586592179,
"acc_norm_stderr": 0.016663683295020527
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023805186524888146,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023805186524888146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8038585209003215,
"acc_stderr": 0.02255244778047801,
"acc_norm": 0.8038585209003215,
"acc_norm_stderr": 0.02255244778047801
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157368,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5602836879432624,
"acc_stderr": 0.029609912075594116,
"acc_norm": 0.5602836879432624,
"acc_norm_stderr": 0.029609912075594116
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5273794002607562,
"acc_stderr": 0.012751075788015074,
"acc_norm": 0.5273794002607562,
"acc_norm_stderr": 0.012751075788015074
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.023886881922440345,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.023886881922440345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146634,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.02520696315422539,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.02520696315422539
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.602203182374541,
"mc1_stderr": 0.017133934248559676,
"mc2": 0.7494844581449875,
"mc2_stderr": 0.014345730353310387
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.010796468688068684
},
"harness|gsm8k|5": {
"acc": 0.5981804397270659,
"acc_stderr": 0.013504357787494039
}
}
```
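
As a rough illustration only (this mirrors, but is not, the leaderboard's own aggregation code), the snippet below assumes the JSON block above has been parsed into a Python dict named `results` and averages the six headline benchmark scores in the way they are typically reported:

```python
# Assumes `results` is the dict shown in the "Latest results" block above.
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]

scores = {
    "ARC (25-shot, acc_norm)": results["harness|arc:challenge|25"]["acc_norm"],
    "HellaSwag (10-shot, acc_norm)": results["harness|hellaswag|10"]["acc_norm"],
    "MMLU (5-shot, acc)": sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks),
    "TruthfulQA (0-shot, mc2)": results["harness|truthfulqa:mc|0"]["mc2"],
    "Winogrande (5-shot, acc)": results["harness|winogrande|5"]["acc"],
    "GSM8K (5-shot, acc)": results["harness|gsm8k|5"]["acc"],
}

for name, value in scores.items():
    print(f"{name}: {value:.4f}")

print(f"Average: {sum(scores.values()) / len(scores):.4f}")
```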
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B | [
"region:us"
] | 2024-01-16T11:59:36+00:00 | {"pretty_name": "Evaluation run of ycros/BagelMIsteryTour-8x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ycros/BagelMIsteryTour-8x7B](https://huggingface.co/ycros/BagelMIsteryTour-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T11:57:17.024146](https://huggingface.co/datasets/open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B/blob/main/results_2024-01-16T11-57-17.024146.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7118537836260681,\n \"acc_stderr\": 0.030292814267622557,\n \"acc_norm\": 0.7154538145620108,\n \"acc_norm_stderr\": 0.030880099721365066,\n \"mc1\": 0.602203182374541,\n \"mc1_stderr\": 0.017133934248559676,\n \"mc2\": 0.7494844581449875,\n \"mc2_stderr\": 0.014345730353310387\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.013385021637313572,\n \"acc_norm\": 0.7244027303754266,\n \"acc_norm_stderr\": 0.01305716965576184\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6919936267675761,\n \"acc_stderr\": 0.004607256752931882,\n \"acc_norm\": 0.8750248954391555,\n \"acc_norm_stderr\": 0.0033001484456091326\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775406,\n \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775406\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.034961014811911786,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.034961014811911786\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.02989614568209546,\n \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.02989614568209546\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.02573833063941215,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.02573833063941215\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8580645161290322,\n \"acc_stderr\": 0.019853003676559747,\n \"acc_norm\": 0.8580645161290322,\n \"acc_norm_stderr\": 0.019853003676559747\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6108374384236454,\n \"acc_stderr\": 0.034304624161038716,\n \"acc_norm\": 0.6108374384236454,\n \"acc_norm_stderr\": 0.034304624161038716\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983116,\n \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983116\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.015216761819262577,\n \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.015216761819262577\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7153846153846154,\n \"acc_stderr\": 0.0228783227997063,\n \"acc_norm\": 0.7153846153846154,\n \"acc_norm_stderr\": 0.0228783227997063\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8067226890756303,\n \"acc_stderr\": 0.025649470265889183,\n \"acc_norm\": 0.8067226890756303,\n \"acc_norm_stderr\": 0.025649470265889183\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8917431192660551,\n \"acc_stderr\": 0.013321348447611753,\n \"acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.013321348447611753\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8725490196078431,\n \"acc_stderr\": 0.02340553048084631,\n \"acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.02340553048084631\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.0334327006286962,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.0334327006286962\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911899,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911899\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.01987565502786746,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.01987565502786746\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8684546615581098,\n \"acc_stderr\": 0.012086705214250428,\n \"acc_norm\": 0.8684546615581098,\n \"acc_norm_stderr\": 0.012086705214250428\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4581005586592179,\n \"acc_stderr\": 0.016663683295020527,\n \"acc_norm\": 0.4581005586592179,\n \"acc_norm_stderr\": 0.016663683295020527\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888146,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888146\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8038585209003215,\n \"acc_stderr\": 0.02255244778047801,\n \"acc_norm\": 0.8038585209003215,\n \"acc_norm_stderr\": 0.02255244778047801\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157368,\n \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157368\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5602836879432624,\n \"acc_stderr\": 0.029609912075594116,\n \"acc_norm\": 0.5602836879432624,\n \"acc_norm_stderr\": 0.029609912075594116\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5273794002607562,\n \"acc_stderr\": 0.012751075788015074,\n \"acc_norm\": 0.5273794002607562,\n \"acc_norm_stderr\": 0.012751075788015074\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.023886881922440345,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.023886881922440345\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146634,\n \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146634\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.02520696315422539,\n \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.02520696315422539\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.602203182374541,\n \"mc1_stderr\": 0.017133934248559676,\n \"mc2\": 0.7494844581449875,\n \"mc2_stderr\": 0.014345730353310387\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.010796468688068684\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5981804397270659,\n \"acc_stderr\": 0.013504357787494039\n }\n}\n```", "repo_url": 
"https://huggingface.co/ycros/BagelMIsteryTour-8x7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|arc:challenge|25_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|gsm8k|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hellaswag|10_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-57-17.024146.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-57-17.024146.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-57-17.024146.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T11-57-17.024146.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-57-17.024146.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-57-17.024146.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["**/details_harness|winogrande|5_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T11-57-17.024146.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T11_57_17.024146", "path": ["results_2024-01-16T11-57-17.024146.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T11-57-17.024146.parquet"]}]}]} | 2024-01-16T11:59:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ycros/BagelMIsteryTour-8x7B
Dataset automatically created during the evaluation run of model ycros/BagelMIsteryTour-8x7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-16T11:57:17.024146 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ycros/BagelMIsteryTour-8x7B\n\n\n\nDataset automatically created during the evaluation run of model ycros/BagelMIsteryTour-8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T11:57:17.024146(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ycros/BagelMIsteryTour-8x7B\n\n\n\nDataset automatically created during the evaluation run of model ycros/BagelMIsteryTour-8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T11:57:17.024146(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
17fc73986f2caa78434ec00fc5db829a0d6ab9e1 |
# Dataset Card for Evaluation run of xDAN2099/xDAN-L2-moe-2x-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xDAN2099/xDAN-L2-moe-2x-v1](https://huggingface.co/xDAN2099/xDAN-L2-moe-2x-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1",
"harness_winogrande_5",
split="train")
```
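The same call works for any of the other task configurations listed in this card. To read the aggregated metrics rather than per-example details, you can also load the "results" configuration; a minimal sketch, assuming the same `datasets` library (the "results" config and the "latest" split are the ones described above):
```python
from datasets import load_dataset
# Aggregated metrics for the most recent run; the "results" config and the
# "latest" split are both documented in this card.
results = load_dataset("open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1",
	"results",
	split="latest")
print(results[0])
```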
## Latest results
These are the [latest results from run 2024-01-16T11:58:02.756350](https://huggingface.co/datasets/open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1/blob/main/results_2024-01-16T11-58-02.756350.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7649759059339861,
"acc_stderr": 0.02802747077552357,
"acc_norm": 0.7678303278344503,
"acc_norm_stderr": 0.02857170363137812,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6176977126106841,
"mc2_stderr": 0.014998426067966347
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205761,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.01357265770308495
},
"harness|hellaswag|10": {
"acc": 0.6736705835490938,
"acc_stderr": 0.004679111783653905,
"acc_norm": 0.8630750846444931,
"acc_norm_stderr": 0.0034306550069275773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.02427022773752271,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.02427022773752271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.023508739218846948,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.023508739218846948
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7872340425531915,
"acc_stderr": 0.026754391348039776,
"acc_norm": 0.7872340425531915,
"acc_norm_stderr": 0.026754391348039776
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.03416520447747548,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.03416520447747548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6904761904761905,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.6904761904761905,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.019960225563172885,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.019960225563172885
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527036,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.019880165406588796,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.019880165406588796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.030296771286067326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.030296771286067326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769591,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769591
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.03256850570293648,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.03256850570293648
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640266,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640266
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8295964125560538,
"acc_stderr": 0.025234593447136182,
"acc_norm": 0.8295964125560538,
"acc_norm_stderr": 0.025234593447136182
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8931297709923665,
"acc_stderr": 0.027096548624883733,
"acc_norm": 0.8931297709923665,
"acc_norm_stderr": 0.027096548624883733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563274,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563274
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8895705521472392,
"acc_stderr": 0.024624937788941318,
"acc_norm": 0.8895705521472392,
"acc_norm_stderr": 0.024624937788941318
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.046161430750285455,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.046161430750285455
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640406,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311364,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311364
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.913154533844189,
"acc_stderr": 0.010070298377747776,
"acc_norm": 0.913154533844189,
"acc_norm_stderr": 0.010070298377747776
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.020383229551135022,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.020383229551135022
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7318435754189944,
"acc_stderr": 0.014816119635317005,
"acc_norm": 0.7318435754189944,
"acc_norm_stderr": 0.014816119635317005
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8464052287581699,
"acc_stderr": 0.02064559791041877,
"acc_norm": 0.8464052287581699,
"acc_norm_stderr": 0.02064559791041877
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.0216700588855108,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.0216700588855108
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8765432098765432,
"acc_stderr": 0.01830386880689179,
"acc_norm": 0.8765432098765432,
"acc_norm_stderr": 0.01830386880689179
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.028947338851614098,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.028947338851614098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6160365058670143,
"acc_stderr": 0.01242158783313423,
"acc_norm": 0.6160365058670143,
"acc_norm_stderr": 0.01242158783313423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7977941176470589,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.7977941176470589,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.015588643495370463,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.015588643495370463
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585633,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585633
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594194,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594194
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355024,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355024
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6176977126106841,
"mc2_stderr": 0.014998426067966347
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598479
},
"harness|gsm8k|5": {
"acc": 0.7293404094010614,
"acc_stderr": 0.012238245006183411
}
}
```
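The per-task entries above follow the `harness|<task>|<n_shots>` key pattern, so summary numbers such as the MMLU mean can be recomputed directly from this dictionary. A minimal sketch (the inline JSON is a truncated stand-in for the full results shown above, e.g. the contents of the linked results file):
```python
import json
# Truncated stand-in for the full results dictionary printed above; in practice,
# parse the linked results_2024-01-16T11-58-02.756350.json file instead.
raw_json = """{
  "harness|hendrycksTest-anatomy|5": {"acc": 0.6888888888888889},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.9013157894736842}
}"""
data = json.loads(raw_json)
# Mean accuracy over the hendrycksTest (MMLU) tasks present in the dictionary.
mmlu = [v["acc"] for k, v in data.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU mean acc over {len(mmlu)} tasks: {sum(mmlu) / len(mmlu):.4f}")
```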
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1 | [
"region:us"
] | 2024-01-16T12:00:17+00:00 | {"pretty_name": "Evaluation run of xDAN2099/xDAN-L2-moe-2x-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [xDAN2099/xDAN-L2-moe-2x-v1](https://huggingface.co/xDAN2099/xDAN-L2-moe-2x-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T11:58:02.756350](https://huggingface.co/datasets/open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1/blob/main/results_2024-01-16T11-58-02.756350.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7649759059339861,\n \"acc_stderr\": 0.02802747077552357,\n \"acc_norm\": 0.7678303278344503,\n \"acc_norm_stderr\": 0.02857170363137812,\n \"mc1\": 0.46266829865361075,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6176977126106841,\n \"mc2_stderr\": 0.014998426067966347\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205761,\n \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.01357265770308495\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6736705835490938,\n \"acc_stderr\": 0.004679111783653905,\n \"acc_norm\": 0.8630750846444931,\n \"acc_norm_stderr\": 0.0034306550069275773\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.02427022773752271,\n \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.02427022773752271\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.023508739218846948,\n \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.023508739218846948\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7872340425531915,\n \"acc_stderr\": 0.026754391348039776,\n \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.026754391348039776\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.03416520447747548,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.03416520447747548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6904761904761905,\n \"acc_stderr\": 0.023809523809523864,\n \"acc_norm\": 0.6904761904761905,\n \"acc_norm_stderr\": 0.023809523809523864\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9141414141414141,\n \"acc_stderr\": 0.019960225563172885,\n \"acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.019960225563172885\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527036,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527036\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.019880165406588796,\n \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.019880165406588796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.030296771286067326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.030296771286067326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769591,\n \"acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769591\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.03256850570293648,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.03256850570293648\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640266,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640266\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8295964125560538,\n \"acc_stderr\": 0.025234593447136182,\n \"acc_norm\": 0.8295964125560538,\n \"acc_norm_stderr\": 0.025234593447136182\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8931297709923665,\n \"acc_stderr\": 0.027096548624883733,\n \"acc_norm\": 0.8931297709923665,\n \"acc_norm_stderr\": 0.027096548624883733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563274,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563274\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8895705521472392,\n \"acc_stderr\": 0.024624937788941318,\n \"acc_norm\": 0.8895705521472392,\n \"acc_norm_stderr\": 0.024624937788941318\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.046161430750285455,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.046161430750285455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640406,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640406\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.016534627684311364,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.016534627684311364\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.913154533844189,\n \"acc_stderr\": 0.010070298377747776,\n \"acc_norm\": 0.913154533844189,\n \"acc_norm_stderr\": 0.010070298377747776\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135022,\n \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135022\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7318435754189944,\n \"acc_stderr\": 0.014816119635317005,\n \"acc_norm\": 0.7318435754189944,\n \"acc_norm_stderr\": 0.014816119635317005\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.02064559791041877,\n \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.02064559791041877\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n \"acc_stderr\": 0.0216700588855108,\n \"acc_norm\": 0.8231511254019293,\n \"acc_norm_stderr\": 0.0216700588855108\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.01830386880689179,\n \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.01830386880689179\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6205673758865248,\n \"acc_stderr\": 0.028947338851614098,\n \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.028947338851614098\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6160365058670143,\n \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.6160365058670143,\n \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7977941176470589,\n \"acc_stderr\": 0.024398192986654924,\n \"acc_norm\": 0.7977941176470589,\n \"acc_norm_stderr\": 0.024398192986654924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.015588643495370463,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.015588643495370463\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585633,\n \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585633\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594194,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594194\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355024,\n \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355024\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6176977126106841,\n \"mc2_stderr\": 0.014998426067966347\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598479\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7293404094010614,\n \"acc_stderr\": 
0.012238245006183411\n }\n}\n```", "repo_url": "https://huggingface.co/xDAN2099/xDAN-L2-moe-2x-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|arc:challenge|25_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|gsm8k|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hellaswag|10_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-58-02.756350.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-58-02.756350.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-58-02.756350.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T11-58-02.756350.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-58-02.756350.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T11_58_02.756350", "path": ["**/details_harness|winogrande|5_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T11-58-02.756350.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_16T11_58_02.756350", "path": ["results_2024-01-16T11-58-02.756350.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T11-58-02.756350.parquet"]}]}]} | 2024-01-16T12:00:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of xDAN2099/xDAN-L2-moe-2x-v1
Dataset automatically created during the evaluation run of model xDAN2099/xDAN-L2-moe-2x-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
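A minimal sketch is shown below; the repository name assumes the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is one of the configurations listed in this card's metadata:

```python
from datasets import load_dataset

# Load one task-specific configuration; the "train" split always points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1",
    "harness_winogrande_5",
    split="train",
)
```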
## Latest results
These are the latest results from run 2024-01-16T11:58:02.756350 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval's results in its own configuration, under the "latest" split):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of xDAN2099/xDAN-L2-moe-2x-v1\n\n\n\nDataset automatically created during the evaluation run of model xDAN2099/xDAN-L2-moe-2x-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T11:58:02.756350(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of xDAN2099/xDAN-L2-moe-2x-v1\n\n\n\nDataset automatically created during the evaluation run of model xDAN2099/xDAN-L2-moe-2x-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T11:58:02.756350(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d771b0890e44b1f6d30d04e073efdca8cbfc463a | # Dataset Card for "training_v0.1.0-public"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | male-2/training_v0.1.0-public | [
"region:us"
] | 2024-01-16T12:08:04+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "show_num", "dtype": "string"}, {"name": "episode_num", "dtype": "int64"}, {"name": "posted_at", "dtype": "string"}, {"name": "label_category", "dtype": "string"}, {"name": "label_isDialogCentric", "dtype": "bool"}, {"name": "label_isCasual", "dtype": "bool"}, {"name": "label_isUncensored", "dtype": "bool"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 1120, "num_examples": 1}], "download_size": 6934, "dataset_size": 1120}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T12:08:06+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "training_v0.1.0-public"
More Information needed | [
"# Dataset Card for \"training_v0.1.0-public\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"training_v0.1.0-public\"\n\nMore Information needed"
] |
0a6267bfcae110f5c1e6642e0da6ae333d30d332 | # Dataset Card for "vsums_synthetic_wikipedia_seed_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Xapien/vsums_synthetic_wikipedia_seed_10k | [
"region:us"
] | 2024-01-16T12:08:54+00:00 | {"dataset_info": {"features": [{"name": "summary_a", "dtype": "string"}, {"name": "summary_b", "dtype": "string"}, {"name": "likihood_label", "dtype": "float64"}, {"name": "synthetic_labels", "dtype": "bool"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2865956, "num_examples": 9992}], "download_size": 1384858, "dataset_size": 2865956}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T12:36:51+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "vsums_synthetic_wikipedia_seed_10k"
More Information needed | [
"# Dataset Card for \"vsums_synthetic_wikipedia_seed_10k\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"vsums_synthetic_wikipedia_seed_10k\"\n\nMore Information needed"
] |
e7d29dc5b945769da80cf7f2b77db3c300e4a8c0 |
# Dataset Card for Evaluation run of adamo1139/yi-34b-200k-rawrr-dpo-1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [adamo1139/yi-34b-200k-rawrr-dpo-1](https://huggingface.co/adamo1139/yi-34b-200k-rawrr-dpo-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1",
"harness_winogrande_5",
split="train")
```
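
The aggregated metrics can be loaded in the same way through the "results" configuration described above. A minimal sketch, assuming this repository follows the same configuration and split layout as the other details repositories in this collection:

```python
from datasets import load_dataset

# Load the aggregated results of the most recent evaluation run;
# the "latest" split mirrors the newest timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1",
    "results",
    split="latest",
)
```

Any per-task split listed in this card's metadata can be loaded the same way by swapping in its configuration name (for example `harness_hellaswag_10`).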
## Latest results
These are the [latest results from run 2024-01-16T12:24:51.812406](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1/blob/main/results_2024-01-16T12-24-51.812406.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.7557329155625617,
"acc_stderr": 0.02836045891045506,
"acc_norm": 0.7606955903500686,
"acc_norm_stderr": 0.02889015510293627,
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5399803687437482,
"mc2_stderr": 0.014956918567738575
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131172,
"acc_norm": 0.6544368600682594,
"acc_norm_stderr": 0.013896938461145675
},
"harness|hellaswag|10": {
"acc": 0.6570404301931886,
"acc_stderr": 0.00473727969103619,
"acc_norm": 0.8569010157339175,
"acc_norm_stderr": 0.0034945810763985265
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.02350873921884694,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.02350873921884694
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7702127659574468,
"acc_stderr": 0.02750175294441242,
"acc_norm": 0.7702127659574468,
"acc_norm_stderr": 0.02750175294441242
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7655172413793103,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.7655172413793103,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6402116402116402,
"acc_stderr": 0.024718075944129277,
"acc_norm": 0.6402116402116402,
"acc_norm_stderr": 0.024718075944129277
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5952380952380952,
"acc_stderr": 0.043902592653775635,
"acc_norm": 0.5952380952380952,
"acc_norm_stderr": 0.043902592653775635
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.01730838128103453,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.01730838128103453
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6699507389162561,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.6699507389162561,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781675,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781675
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909039,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909039
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8128205128205128,
"acc_stderr": 0.019776601086550036,
"acc_norm": 0.8128205128205128,
"acc_norm_stderr": 0.019776601086550036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.029958249250082114,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.029958249250082114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8319327731092437,
"acc_stderr": 0.02428910211569226,
"acc_norm": 0.8319327731092437,
"acc_norm_stderr": 0.02428910211569226
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769574,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407388,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407388
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8834355828220859,
"acc_stderr": 0.02521232721050711,
"acc_norm": 0.8834355828220859,
"acc_norm_stderr": 0.02521232721050711
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.010648356301876341,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.010648356301876341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.02115267696657528,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.02115267696657528
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6636871508379888,
"acc_stderr": 0.015801003729145887,
"acc_norm": 0.6636871508379888,
"acc_norm_stderr": 0.015801003729145887
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8660130718954249,
"acc_stderr": 0.019504890618464815,
"acc_norm": 0.8660130718954249,
"acc_norm_stderr": 0.019504890618464815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8360128617363344,
"acc_stderr": 0.021029576464662695,
"acc_norm": 0.8360128617363344,
"acc_norm_stderr": 0.021029576464662695
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.01887735383957185,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.01887735383957185
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6063829787234043,
"acc_stderr": 0.029144544781596154,
"acc_norm": 0.6063829787234043,
"acc_norm_stderr": 0.029144544781596154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5971316818774446,
"acc_stderr": 0.012526955577118012,
"acc_norm": 0.5971316818774446,
"acc_norm_stderr": 0.012526955577118012
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8161764705882353,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.8161764705882353,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.02435280072297001,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.02435280072297001
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.0206871869515341,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.0206871869515341
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5399803687437482,
"mc2_stderr": 0.014956918567738575
},
"harness|winogrande|5": {
"acc": 0.8279400157853196,
"acc_stderr": 0.010607731615247008
},
"harness|gsm8k|5": {
"acc": 0.6178923426838514,
"acc_stderr": 0.013384173935648492
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1 | [
"region:us"
] | 2024-01-16T12:27:07+00:00 | {"pretty_name": "Evaluation run of adamo1139/yi-34b-200k-rawrr-dpo-1", "dataset_summary": "Dataset automatically created during the evaluation run of model [adamo1139/yi-34b-200k-rawrr-dpo-1](https://huggingface.co/adamo1139/yi-34b-200k-rawrr-dpo-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T12:24:51.812406](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1/blob/main/results_2024-01-16T12-24-51.812406.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7557329155625617,\n \"acc_stderr\": 0.02836045891045506,\n \"acc_norm\": 0.7606955903500686,\n \"acc_norm_stderr\": 0.02889015510293627,\n \"mc1\": 0.3929008567931457,\n \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5399803687437482,\n \"mc2_stderr\": 0.014956918567738575\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131172,\n \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.013896938461145675\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6570404301931886,\n \"acc_stderr\": 0.00473727969103619,\n \"acc_norm\": 0.8569010157339175,\n \"acc_norm_stderr\": 0.0034945810763985265\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.02350873921884694,\n \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.02350873921884694\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7702127659574468,\n \"acc_stderr\": 0.02750175294441242,\n \"acc_norm\": 0.7702127659574468,\n \"acc_norm_stderr\": 0.02750175294441242\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7655172413793103,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.7655172413793103,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6402116402116402,\n \"acc_stderr\": 0.024718075944129277,\n \"acc_norm\": 0.6402116402116402,\n \"acc_norm_stderr\": 0.024718075944129277\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5952380952380952,\n \"acc_stderr\": 0.043902592653775635,\n \"acc_norm\": 0.5952380952380952,\n \"acc_norm_stderr\": 0.043902592653775635\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.01730838128103453,\n \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.01730838128103453\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6699507389162561,\n \"acc_stderr\": 0.033085304262282574,\n \"acc_norm\": 0.6699507389162561,\n \"acc_norm_stderr\": 0.033085304262282574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781675,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781675\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909039,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909039\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8128205128205128,\n \"acc_stderr\": 0.019776601086550036,\n \"acc_norm\": 0.8128205128205128,\n \"acc_norm_stderr\": 0.019776601086550036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.029958249250082114,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.029958249250082114\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8319327731092437,\n \"acc_stderr\": 0.02428910211569226,\n \"acc_norm\": 0.8319327731092437,\n \"acc_norm_stderr\": 0.02428910211569226\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769574,\n \"acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769574\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8834355828220859,\n \"acc_stderr\": 0.02521232721050711,\n \"acc_norm\": 0.8834355828220859,\n \"acc_norm_stderr\": 0.02521232721050711\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n 
\"acc_stderr\": 0.010648356301876341,\n \"acc_norm\": 0.9016602809706258,\n \"acc_norm_stderr\": 0.010648356301876341\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.02115267696657528,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.02115267696657528\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6636871508379888,\n \"acc_stderr\": 0.015801003729145887,\n \"acc_norm\": 0.6636871508379888,\n \"acc_norm_stderr\": 0.015801003729145887\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8660130718954249,\n \"acc_stderr\": 0.019504890618464815,\n \"acc_norm\": 0.8660130718954249,\n \"acc_norm_stderr\": 0.019504890618464815\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8360128617363344,\n \"acc_stderr\": 0.021029576464662695,\n \"acc_norm\": 0.8360128617363344,\n \"acc_norm_stderr\": 0.021029576464662695\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.01887735383957185,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.01887735383957185\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6063829787234043,\n \"acc_stderr\": 0.029144544781596154,\n \"acc_norm\": 0.6063829787234043,\n \"acc_norm_stderr\": 0.029144544781596154\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5971316818774446,\n \"acc_stderr\": 0.012526955577118012,\n \"acc_norm\": 0.5971316818774446,\n \"acc_norm_stderr\": 0.012526955577118012\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8161764705882353,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.8161764705882353,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.02435280072297001,\n \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.02435280072297001\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n \"acc_stderr\": 0.0206871869515341,\n \"acc_norm\": 0.9054726368159204,\n \"acc_norm_stderr\": 0.0206871869515341\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3929008567931457,\n \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5399803687437482,\n \"mc2_stderr\": 0.014956918567738575\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615247008\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6178923426838514,\n \"acc_stderr\": 0.013384173935648492\n }\n}\n```", "repo_url": 
"https://huggingface.co/adamo1139/yi-34b-200k-rawrr-dpo-1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|arc:challenge|25_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|gsm8k|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hellaswag|10_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-24-51.812406.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-24-51.812406.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-24-51.812406.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T12-24-51.812406.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-24-51.812406.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-24-51.812406.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["**/details_harness|winogrande|5_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T12-24-51.812406.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T12_24_51.812406", "path": ["results_2024-01-16T12-24-51.812406.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T12-24-51.812406.parquet"]}]}]} | 2024-01-16T12:27:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of adamo1139/yi-34b-200k-rawrr-dpo-1
Dataset automatically created during the evaluation run of model adamo1139/yi-34b-200k-rawrr-dpo-1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
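A minimal sketch (the repository name below follows the leaderboard's usual `details_<org>__<model>` naming convention and is an assumption, as is the chosen configuration):

```python
from datasets import load_dataset

# Assumed repo id following the Open LLM Leaderboard naming pattern
data = load_dataset(
    "open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1",
    "harness_winogrande_5",  # any of the 63 configurations listed in this card
    split="latest",          # or one of the timestamped splits
)
```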
## Latest results
These are the latest results from run 2024-01-16T12:24:51.812406 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of adamo1139/yi-34b-200k-rawrr-dpo-1\n\n\n\nDataset automatically created during the evaluation run of model adamo1139/yi-34b-200k-rawrr-dpo-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T12:24:51.812406(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of adamo1139/yi-34b-200k-rawrr-dpo-1\n\n\n\nDataset automatically created during the evaluation run of model adamo1139/yi-34b-200k-rawrr-dpo-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T12:24:51.812406(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
68539cb038623b57d13a3eb0dd7c099d667895ef |
# Synthetic Search Filters
These are possible search filters and their representations, generated with GPT-4 Turbo, for the given business/service categories:
```
Educational Institutions, Job Recruitment Agencies, Banking Services, Investment Services, Insurance Services, Financial Planning and Advisory, Credit Services, Payment Processing, Mortgage and Real Estate Services, Taxation Services, Risk Management and Compliance, Digital and Mobile Banking, Retail Stores (Online and Offline), Automotive Dealerships, Restaurants and Food Delivery Services, Entertainment and Media Platforms, Government Services, Travelers and Consumers, Logistics and Supply Chain Management, Customer Support Services, Market Research Firms, Mobile App Development, Game Development, Cloud Computing Services, Data Analytics and Business Intelligence, Cybersecurity Software, User Interface/User Experience Design, Internet of Things (IoT) Development, Project Management Tools, Version Control Systems, Continuous Integration/Continuous Deployment, Issue Tracking and Bug Reporting, Collaborative Development Environments, Team Communication and Chat Tools, Task and Time Management, Customer Support and Feedback, Cloud-based Development Environments, Image Stock Platforms, Video Hosting and Portals, Social Networks, Professional Social Networks, Dating Apps, Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing
```
This is a parsed version of [`EmbeddingStudio/synthetic-search-filters-raw`](https://huggingface.co/datasets/EmbeddingStudio/synthetic-search-filters-raw), where each row is a unique filter - representation pair.
## Columns description
* category (type: Optional[str]) - business/service category name.
* category_description (type: Optional[str]) - longer description of business/service.
* filter_name (type: Optional[str]) - meaningful name of filter.
* representation_name (type: Optional[str]) - name of filter representation.
* representation_type (type: Optional[str]) - python-like type of representation value (str, int, float, bool)
* representation_enum (type: Optional[List[str]]) - if the representation is an enumeration, this is a list of possible values.
* representation_examples (type: List[Union[str, int, float]]) - examples of expected representation values.
* representation_pattern (type: Optional[str]) - if the representation is pattern-like (e.g. `dd/mm/YYYY`), this is a pattern to follow.
## What are representations?
It's easier to understand with an example. Imagine you have a filter named `Rating`; it can be represented as (see the sketch after this list):
* Integer or float value in 1-5 scale
* Integer or float value in 1-10 scale
* Integer or float value in 1-100 scale
* As the enumeration with values (*, **, ***, ****, *****)
* As the enumeration with values (bad, medium, good, the best)
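As a rough sketch of how such representations map onto the dataset columns described above (the rows below are hypothetical, not taken from the dataset):

```python
# Hypothetical rows for a "Rating" filter, following the column layout described above
rating_rows = [
    {
        "filter_name": "Rating",
        "representation_name": "five_point_scale",
        "representation_type": "float",
        "representation_enum": None,
        "representation_examples": ["1.0", "3.5", "5.0"],
        "representation_pattern": None,
    },
    {
        "filter_name": "Rating",
        "representation_name": "stars",
        "representation_type": "str",
        "representation_enum": ["*", "**", "***", "****", "*****"],
        "representation_examples": ["***", "*****"],
        "representation_pattern": None,
    },
]
```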
## Train / test splitting principles
As we are trying to fine-tune an LLM to follow zero-shot query parsing instructions, we want to test:
* Ability to work well with unseen domain
* Ability to work well with unseen filters
* Ability to work well with unseen queries
For these purposes we:
1. We put 5 categories into the test split, completely separated from train: Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing.
2. Also, for each company category appearing in train, we put aside / removed one filter and the queries related to it.
# How to use it
```python
from datasets import load_dataset
filters_dataset = load_dataset("EmbeddingStudio/synthetic-search-filters")
```
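For instance, the rows can be regrouped into a per-category schema (a minimal sketch; the split and column names follow the dataset description above):

```python
from collections import defaultdict

from datasets import load_dataset

filters_dataset = load_dataset("EmbeddingStudio/synthetic-search-filters")

# Group filter representations by business/service category for the train split
train_schema = defaultdict(lambda: defaultdict(list))
for row in filters_dataset["train_filters"]:
    train_schema[row["category"]][row["filter_name"]].append(
        {
            "name": row["representation_name"],
            "type": row["representation_type"],
            "enum": row["representation_enum"],
            "examples": row["representation_examples"],
            "pattern": row["representation_pattern"],
        }
    )
```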
The Embedding Studio team uses these filters to [generate queries and their parsed versions](EmbeddingStudio/query-parsing-instructions-falcon) for [Falcon-7B-Instruct](https://huggingface.co/tiiuae/falcon-7b-instruct) [fine-tuning to follow Zero-Shot search query parsing instructions](https://huggingface.co/EmbeddingStudio/query-parser-falcon-7b-instruct).
| EmbeddingStudio/synthetic-search-filters | [
"task_categories:token-classification",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"synthetic",
"search-queries",
"e-commerce",
"online-shops",
"travel-agencies",
"educational-institutions-ai",
"job-recruitment-automation",
"banking-digital-services",
"investment-ai-analysis",
"insurance-tech-innovation",
"financial-advisory-ai",
"credit-services-automation",
"payment-processing-tech",
"mortgage-tech-solutions",
"real-estate-digital-solutions",
"taxation-tech-services",
"risk-management-ai",
"compliance-automation",
"digital-banking-innovation",
"mobile-banking-tech",
"online-retail-tech",
"offline-retail-automation",
"automotive-dealership-tech",
"restaurant-automation-tech",
"food-delivery-ai",
"entertainment-platforms-ai",
"media-platforms-tech",
"government-services-automation",
"travel-tech-innovation",
"consumer-analytics-ai",
"logistics-tech-automation",
"supply-chain-ai",
"customer-support-tech",
"market-research-ai",
"mobile-app-dev-tech",
"game-dev-ai",
"cloud-computing-services",
"data-analytics-ai",
"business-intelligence-ai",
"cybersecurity-software-tech",
"ui-ux-design-ai",
"iot-development-tech",
"project-management-tools-ai",
"version-control-systems-tech",
"ci-cd-automation",
"issue-tracking-ai",
"bug-reporting-automation",
"collaborative-dev-environments",
"team-communication-tech",
"task-time-management-ai",
"customer-feedback-ai",
"cloud-based-dev-tech",
"image-stock-platforms-ai",
"video-hosting-tech",
"social-networks-ai",
"professional-social-networks-ai",
"dating-apps-tech",
"region:us"
] | 2024-01-16T12:43:14+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["token-classification", "text-generation"], "pretty_name": "Synthetic Search Filters", "dataset_info": {"features": [{"name": "category", "dtype": "string"}, {"name": "category_description", "dtype": "string"}, {"name": "filter_name", "dtype": "string"}, {"name": "representation_name", "dtype": "string"}, {"name": "representation_type", "dtype": "string"}, {"name": "representation_enum", "sequence": "string"}, {"name": "representation_examples", "sequence": "string"}, {"name": "representation_pattern", "dtype": "string"}], "splits": [{"name": "train_filters", "num_bytes": 411999, "num_examples": 1725}, {"name": "test_filters", "num_bytes": 512983, "num_examples": 2164}], "download_size": 128534, "dataset_size": 924982}, "configs": [{"config_name": "default", "data_files": [{"split": "train_filters", "path": "data/train_filters-*"}, {"split": "test_filters", "path": "data/test_filters-*"}]}], "tags": ["synthetic", "search-queries", "e-commerce", "online-shops", "travel-agencies", "educational-institutions-ai", "job-recruitment-automation", "banking-digital-services", "investment-ai-analysis", "insurance-tech-innovation", "financial-advisory-ai", "credit-services-automation", "payment-processing-tech", "mortgage-tech-solutions", "real-estate-digital-solutions", "taxation-tech-services", "risk-management-ai", "compliance-automation", "digital-banking-innovation", "mobile-banking-tech", "online-retail-tech", "offline-retail-automation", "automotive-dealership-tech", "restaurant-automation-tech", "food-delivery-ai", "entertainment-platforms-ai", "media-platforms-tech", "government-services-automation", "travel-tech-innovation", "consumer-analytics-ai", "logistics-tech-automation", "supply-chain-ai", "customer-support-tech", "market-research-ai", "mobile-app-dev-tech", "game-dev-ai", "cloud-computing-services", "data-analytics-ai", "business-intelligence-ai", "cybersecurity-software-tech", "ui-ux-design-ai", "iot-development-tech", "project-management-tools-ai", "version-control-systems-tech", "ci-cd-automation", "issue-tracking-ai", "bug-reporting-automation", "collaborative-dev-environments", "team-communication-tech", "task-time-management-ai", "customer-feedback-ai", "cloud-based-dev-tech", "image-stock-platforms-ai", "video-hosting-tech", "social-networks-ai", "professional-social-networks-ai", "dating-apps-tech"]} | 2024-02-02T10:45:43+00:00 | [] | [
"en"
] | TAGS
#task_categories-token-classification #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-apache-2.0 #synthetic #search-queries #e-commerce #online-shops #travel-agencies #educational-institutions-ai #job-recruitment-automation #banking-digital-services #investment-ai-analysis #insurance-tech-innovation #financial-advisory-ai #credit-services-automation #payment-processing-tech #mortgage-tech-solutions #real-estate-digital-solutions #taxation-tech-services #risk-management-ai #compliance-automation #digital-banking-innovation #mobile-banking-tech #online-retail-tech #offline-retail-automation #automotive-dealership-tech #restaurant-automation-tech #food-delivery-ai #entertainment-platforms-ai #media-platforms-tech #government-services-automation #travel-tech-innovation #consumer-analytics-ai #logistics-tech-automation #supply-chain-ai #customer-support-tech #market-research-ai #mobile-app-dev-tech #game-dev-ai #cloud-computing-services #data-analytics-ai #business-intelligence-ai #cybersecurity-software-tech #ui-ux-design-ai #iot-development-tech #project-management-tools-ai #version-control-systems-tech #ci-cd-automation #issue-tracking-ai #bug-reporting-automation #collaborative-dev-environments #team-communication-tech #task-time-management-ai #customer-feedback-ai #cloud-based-dev-tech #image-stock-platforms-ai #video-hosting-tech #social-networks-ai #professional-social-networks-ai #dating-apps-tech #region-us
|
# Synthetic Search Filters
These are possible search filters and their representations, generated with GPT-4 Turbo, for the given business/service categories:
This is a parsed version of 'EmbeddingStudio/synthetic-search-filters-raw', where each row is a unique filter - representation pair.
## Columns description
* category (type: Optional[str]) - business/service category name.
* category_description (type: Optional[str]) - longer description of business/service.
* filter_name (type: Optional[str]) - meaningful name of filter.
* representation_name (type: Optional[str]) - name of filter representation.
* representation_type (type: Optional[str]) - python-like type of representation value (str, int, float, bool)
* representation_enum (type: Optional[List[str]]) - if the representation is an enumeration, this is a list of possible values.
* representation_examples (type: List[Union[str, int, float]]) - examples of expected representation values.
* representation_pattern (type: Optional[str]) - if the representation is pattern-like (e.g. 'dd/mm/YYYY'), this is a pattern to follow.
## What are representations?
It's easier to understand with an example. Imagine you have a filter named 'Rating'; it can be represented as:
* Integer or float value in 1-5 scale
* Integer or float value in 1-10 scale
* Integer or float value in 1-100 scale
* As the enumeration with values (*, **, ***, ****, *****)
* As the enumeration with values (bad, medium, good, the best)
## Train / test splitting principles
As we are trying to fine-tune an LLM to follow zero-shot query parsing instructions, we want to test:
* Ability to work well with unseen domain
* Ability to work well with unseen filters
* Ability to work well with unseen queries
For these purposes we:
1. We put 5 categories into the test split, completely separated from train: Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing.
2. Also, for each company category appearing in train, we put aside / removed one filter and the queries related to it.
# How to use it
The Embedding Studio team uses these filters to generate queries and their parsed versions for Falcon-7B-Instruct fine-tuning to follow Zero-Shot search query parsing instructions.
| [
"# Synthetic Search Filters\n\nThis is generated with GPT-4 Turbo possible search filters and theirs representations for the given business/service categories:\n\n\nThis is a parsed in the way each row is an unique pair filter - represantation version of 'EmbeddingStudio/synthetic-search-filters-raw'.",
"## Columns description\n\n* category (type: Optional[str]) - business/service category name.\n* category_description (type: Optional[str]) - longer description of business/service.\n* filter_name (type: Optional[str]) - meaningful name of filter.\n* representation_name (type: Optional[str]) - name of filter representation.\n* representation_type (type: Optional[str]) - python-like type of representation value (str, int, float, bool)\n* representation_enum (type: (Optional[List[str]])) - is represntation is an enumertation, this is a list of possible values.\n* representation_examples (type: List[Union[str, int, float]])) - exmaples of expected representation values.\n* representation_pattern (type: Optional[str]) - if representation is a pattern-like (e.g. 'dd/mm/YYYY'), this is a pattern to follow.",
"## What are representations?\n\nIt's easier to understand with an exmaple. Imagine, you have a filter named 'Rating', so it can be represented as:\n* Integer or float value in 1-5 scale\n* Integer or float value in 1-10 scale\n* Integer or float value in 1-100 scale\n* As the enumeration with values (*, , *, , *)\n* As the enumeration with values (bad, medium, good, the best)",
"## Train / test splitting principles\n\nAs we are trying to fine-tune LLM to follow zero-shot query parsing instructions, so we want to test:\n\n* Ability to work well with unseen domain\n* Ability to work well with unseen filters\n* Ability to work well with unseen queries\n\nFor these purposes we:\n\n1. We put into test split 5 categories, completely separared from train: Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing.\n2. Also out of each appearing in train company categories, we put aside / removed one filter and queries related to it.",
"# How to use it\n\n\nEmbedding Studio team uses this filters to generate queries and theirs parsed version for Falcon-7B-Instruct fine-tuning to follow Zero-Shot search queries parsing instructions."
] | [
"TAGS\n#task_categories-token-classification #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-apache-2.0 #synthetic #search-queries #e-commerce #online-shops #travel-agencies #educational-institutions-ai #job-recruitment-automation #banking-digital-services #investment-ai-analysis #insurance-tech-innovation #financial-advisory-ai #credit-services-automation #payment-processing-tech #mortgage-tech-solutions #real-estate-digital-solutions #taxation-tech-services #risk-management-ai #compliance-automation #digital-banking-innovation #mobile-banking-tech #online-retail-tech #offline-retail-automation #automotive-dealership-tech #restaurant-automation-tech #food-delivery-ai #entertainment-platforms-ai #media-platforms-tech #government-services-automation #travel-tech-innovation #consumer-analytics-ai #logistics-tech-automation #supply-chain-ai #customer-support-tech #market-research-ai #mobile-app-dev-tech #game-dev-ai #cloud-computing-services #data-analytics-ai #business-intelligence-ai #cybersecurity-software-tech #ui-ux-design-ai #iot-development-tech #project-management-tools-ai #version-control-systems-tech #ci-cd-automation #issue-tracking-ai #bug-reporting-automation #collaborative-dev-environments #team-communication-tech #task-time-management-ai #customer-feedback-ai #cloud-based-dev-tech #image-stock-platforms-ai #video-hosting-tech #social-networks-ai #professional-social-networks-ai #dating-apps-tech #region-us \n",
"# Synthetic Search Filters\n\nThis is generated with GPT-4 Turbo possible search filters and theirs representations for the given business/service categories:\n\n\nThis is a parsed in the way each row is an unique pair filter - represantation version of 'EmbeddingStudio/synthetic-search-filters-raw'.",
"## Columns description\n\n* category (type: Optional[str]) - business/service category name.\n* category_description (type: Optional[str]) - longer description of business/service.\n* filter_name (type: Optional[str]) - meaningful name of filter.\n* representation_name (type: Optional[str]) - name of filter representation.\n* representation_type (type: Optional[str]) - python-like type of representation value (str, int, float, bool)\n* representation_enum (type: (Optional[List[str]])) - is represntation is an enumertation, this is a list of possible values.\n* representation_examples (type: List[Union[str, int, float]])) - exmaples of expected representation values.\n* representation_pattern (type: Optional[str]) - if representation is a pattern-like (e.g. 'dd/mm/YYYY'), this is a pattern to follow.",
"## What are representations?\n\nIt's easier to understand with an exmaple. Imagine, you have a filter named 'Rating', so it can be represented as:\n* Integer or float value in 1-5 scale\n* Integer or float value in 1-10 scale\n* Integer or float value in 1-100 scale\n* As the enumeration with values (*, , *, , *)\n* As the enumeration with values (bad, medium, good, the best)",
"## Train / test splitting principles\n\nAs we are trying to fine-tune LLM to follow zero-shot query parsing instructions, so we want to test:\n\n* Ability to work well with unseen domain\n* Ability to work well with unseen filters\n* Ability to work well with unseen queries\n\nFor these purposes we:\n\n1. We put into test split 5 categories, completely separared from train: Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing.\n2. Also out of each appearing in train company categories, we put aside / removed one filter and queries related to it.",
"# How to use it\n\n\nEmbedding Studio team uses this filters to generate queries and theirs parsed version for Falcon-7B-Instruct fine-tuning to follow Zero-Shot search queries parsing instructions."
] |
c324f81200a062c562932cde63de0f09fc6e0d6c |
# Synthetic Search Filters Raw
This is the raw version of [EmbeddingStudio/synthetic-search-filters dataset](https://huggingface.co/datasets/EmbeddingStudio/synthetic-search-filters).
These are possible search filters and their representations, generated with GPT-4 Turbo, for the given business/service categories:
```
Educational Institutions, Job Recruitment Agencies, Banking Services, Investment Services, Insurance Services, Financial Planning and Advisory, Credit Services, Payment Processing, Mortgage and Real Estate Services, Taxation Services, Risk Management and Compliance, Digital and Mobile Banking, Retail Stores (Online and Offline), Automotive Dealerships, Restaurants and Food Delivery Services, Entertainment and Media Platforms, Government Services, Travelers and Consumers, Logistics and Supply Chain Management, Customer Support Services, Market Research Firms, Mobile App Development, Game Development, Cloud Computing Services, Data Analytics and Business Intelligence, Cybersecurity Software, User Interface/User Experience Design, Internet of Things (IoT) Development, Project Management Tools, Version Control Systems, Continuous Integration/Continuous Deployment, Issue Tracking and Bug Reporting, Collaborative Development Environments, Team Communication and Chat Tools, Task and Time Management, Customer Support and Feedback, Cloud-based Development Environments, Image Stock Platforms, Video Hosting and Portals, Social Networks, Professional Social Networks, Dating Apps, Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing
```
## Columns description
* category (type: str) - name of a business / service.
* filters (type: str) - JSON-parsable filters schema.
The filters schema is a JSON-readable line in the following format (we highly recommend you use it; see the example after this list):
List of filters (dict):
* Name - name of filter (better to be meaningful).
* Representations - list of possible filter formats (dict):
* Name - name of representation (better to be meaningful).
* Type - python base type (int, float, str, bool).
* Examples - list of examples.
* Enum - if a representation is an enumeration, provide a list of possible values; the LLM should map the parsed value into this list.
* Pattern - if a representation is pattern-like (datetime, regexp, etc.), provide a pattern text in any format.
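A minimal, hypothetical example of such a schema for a single category (the filter and values are illustrative only, not actual rows from the dataset):

```python
# Hypothetical parsed value of the "filters" column for one category
example_filters = [
    {
        "Name": "Rating",
        "Representations": [
            {"Name": "five_point_scale", "Type": "float", "Examples": [1.0, 3.5, 5.0]},
            {
                "Name": "stars",
                "Type": "str",
                "Enum": ["*", "**", "***", "****", "*****"],
                "Examples": ["***", "*****"],
            },
        ],
    }
]
```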
## What are representations?
It's easier to understand with an example. Imagine you have a filter named `Rating`; it can be represented as:
* Integer or float value in 1-5 scale
* Integer or float value in 1-10 scale
* Integer or float value in 1-100 scale
* As the enumeration with values (*, **, ***, ****, *****)
* As the enumeration with values (bad, medium, good, the best)
## Train / test splitting principles
As we are trying to fine-tune an LLM to follow zero-shot query parsing instructions, we want to test:
* Ability to work well with unseen domain
* Ability to work well with unseen filters
* Ability to work well with unseen queries
For these purposes we:
1. We put 5 categories into the test split, completely separated from train: Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing.
2. Also, for each company category appearing in train, we put aside / removed one filter and the queries related to it.
# How to use it
```python
import json
from datasets import load_dataset
filters_dataset = load_dataset("EmbeddingStudio/synthetic-search-filters-raw")

# Build a {category: filters-schema} mapping for the train split
train_filters_schema = dict()
for row in filters_dataset['train_filters_raw']:
    train_filters_schema[json.loads(row['Category'])['category']] = json.loads(row['Filters'])

# Same mapping for the test split
test_filters_schema = dict()
for row in filters_dataset['test_filters_raw']:
    test_filters_schema[json.loads(row['Category'])['category']] = json.loads(row['Filters'])
```
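As a quick sanity check (assuming the mapping above was built successfully; the inspected category is simply whichever comes first):

```python
# Inspect the parsed schema for one category
category = next(iter(train_filters_schema))
print(category)
print(train_filters_schema[category][:1])  # first filter definition of that category
```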
The Embedding Studio team uses these filters to [generate queries and their parsed versions](EmbeddingStudio/query-parsing-instructions-falcon) for [Falcon-7B-Instruct](https://huggingface.co/tiiuae/falcon-7b-instruct) [fine-tuning to follow Zero-Shot search query parsing instructions](https://huggingface.co/EmbeddingStudio/query-parser-falcon-7b-instruct).
| EmbeddingStudio/synthetic-search-filters-raw | [
"task_categories:token-classification",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"synthetic",
"search-queries",
"e-commerce",
"online-shops",
"travel-agencies",
"educational-institutions-ai",
"job-recruitment-automation",
"banking-digital-services",
"investment-ai-analysis",
"insurance-tech-innovation",
"financial-advisory-ai",
"credit-services-automation",
"payment-processing-tech",
"mortgage-tech-solutions",
"real-estate-digital-solutions",
"taxation-tech-services",
"risk-management-ai",
"compliance-automation",
"digital-banking-innovation",
"mobile-banking-tech",
"online-retail-tech",
"offline-retail-automation",
"automotive-dealership-tech",
"restaurant-automation-tech",
"food-delivery-ai",
"entertainment-platforms-ai",
"media-platforms-tech",
"government-services-automation",
"travel-tech-innovation",
"consumer-analytics-ai",
"logistics-tech-automation",
"supply-chain-ai",
"customer-support-tech",
"market-research-ai",
"mobile-app-dev-tech",
"game-dev-ai",
"cloud-computing-services",
"data-analytics-ai",
"business-intelligence-ai",
"cybersecurity-software-tech",
"ui-ux-design-ai",
"iot-development-tech",
"project-management-tools-ai",
"version-control-systems-tech",
"ci-cd-automation",
"issue-tracking-ai",
"bug-reporting-automation",
"collaborative-dev-environments",
"team-communication-tech",
"task-time-management-ai",
"customer-feedback-ai",
"cloud-based-dev-tech",
"image-stock-platforms-ai",
"video-hosting-tech",
"social-networks-ai",
"professional-social-networks-ai",
"dating-apps-tech",
"region:us"
] | 2024-01-16T12:46:17+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["token-classification", "text-generation"], "pretty_name": "Synthetic Search Filters Raw", "dataset_info": {"features": [{"name": "category", "dtype": "string"}, {"name": "category_description", "dtype": "string"}, {"name": "filter_name", "dtype": "string"}, {"name": "representation_name", "dtype": "string"}, {"name": "representation_type", "dtype": "string"}, {"name": "representation_enum", "sequence": "string"}, {"name": "representation_examples", "sequence": "string"}, {"name": "representation_pattern", "dtype": "string"}], "splits": [{"name": "train_filters", "num_bytes": 411999, "num_examples": 1725}, {"name": "test_filters", "num_bytes": 512983, "num_examples": 2164}], "download_size": 128534, "dataset_size": 924982}, "configs": [{"config_name": "default", "data_files": [{"split": "train_filters", "path": "data/train_filters-*"}, {"split": "test_filters", "path": "data/test_filters-*"}]}], "tags": ["synthetic", "search-queries", "e-commerce", "online-shops", "travel-agencies", "educational-institutions-ai", "job-recruitment-automation", "banking-digital-services", "investment-ai-analysis", "insurance-tech-innovation", "financial-advisory-ai", "credit-services-automation", "payment-processing-tech", "mortgage-tech-solutions", "real-estate-digital-solutions", "taxation-tech-services", "risk-management-ai", "compliance-automation", "digital-banking-innovation", "mobile-banking-tech", "online-retail-tech", "offline-retail-automation", "automotive-dealership-tech", "restaurant-automation-tech", "food-delivery-ai", "entertainment-platforms-ai", "media-platforms-tech", "government-services-automation", "travel-tech-innovation", "consumer-analytics-ai", "logistics-tech-automation", "supply-chain-ai", "customer-support-tech", "market-research-ai", "mobile-app-dev-tech", "game-dev-ai", "cloud-computing-services", "data-analytics-ai", "business-intelligence-ai", "cybersecurity-software-tech", "ui-ux-design-ai", "iot-development-tech", "project-management-tools-ai", "version-control-systems-tech", "ci-cd-automation", "issue-tracking-ai", "bug-reporting-automation", "collaborative-dev-environments", "team-communication-tech", "task-time-management-ai", "customer-feedback-ai", "cloud-based-dev-tech", "image-stock-platforms-ai", "video-hosting-tech", "social-networks-ai", "professional-social-networks-ai", "dating-apps-tech"]} | 2024-02-02T10:25:08+00:00 | [] | [
"en"
] | TAGS
#task_categories-token-classification #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-apache-2.0 #synthetic #search-queries #e-commerce #online-shops #travel-agencies #educational-institutions-ai #job-recruitment-automation #banking-digital-services #investment-ai-analysis #insurance-tech-innovation #financial-advisory-ai #credit-services-automation #payment-processing-tech #mortgage-tech-solutions #real-estate-digital-solutions #taxation-tech-services #risk-management-ai #compliance-automation #digital-banking-innovation #mobile-banking-tech #online-retail-tech #offline-retail-automation #automotive-dealership-tech #restaurant-automation-tech #food-delivery-ai #entertainment-platforms-ai #media-platforms-tech #government-services-automation #travel-tech-innovation #consumer-analytics-ai #logistics-tech-automation #supply-chain-ai #customer-support-tech #market-research-ai #mobile-app-dev-tech #game-dev-ai #cloud-computing-services #data-analytics-ai #business-intelligence-ai #cybersecurity-software-tech #ui-ux-design-ai #iot-development-tech #project-management-tools-ai #version-control-systems-tech #ci-cd-automation #issue-tracking-ai #bug-reporting-automation #collaborative-dev-environments #team-communication-tech #task-time-management-ai #customer-feedback-ai #cloud-based-dev-tech #image-stock-platforms-ai #video-hosting-tech #social-networks-ai #professional-social-networks-ai #dating-apps-tech #region-us
|
# Synthetic Search Filters Raw
This is the raw version of EmbeddingStudio/synthetic-search-filters dataset.
This is generated with GPT-4 Turbo possible search filters and theirs representations for the given business/service categories:
## Columns description
* category (type: str) - name of a business / service.
* filters (type: str) - JSON parsable filters schema
Filters schema is JSON-readable line in the format (we highly recommend you to use it):
List of filters (dict):
* Name - name of filter (better to be meaningful).
* Representations - list of possible filter formats (dict):
* Name - name of representation (better to be meaningful).
* Type - python base type (int, float, str, bool).
* Examples - list of examples.
* Enum - if a representation is enumeration, provide a list of possible values, LLM should map parsed value into this list.
* Pattern - if a representation is pattern-like (datetime, regexp, etc.) provide a pattern text in any format.
## What are representations?
It's easier to understand with an exmaple. Imagine, you have a filter named 'Rating', so it can be represented as:
* Integer or float value in 1-5 scale
* Integer or float value in 1-10 scale
* Integer or float value in 1-100 scale
* As the enumeration with values (*, , *, , *)
* As the enumeration with values (bad, medium, good, the best)
## Train / test splitting principles
As we are trying to fine-tune LLM to follow zero-shot query parsing instructions, so we want to test:
* Ability to work well with unseen domain
* Ability to work well with unseen filters
* Ability to work well with unseen queries
For these purposes we:
1. We put into test split 5 categories, completely separared from train: Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing.
2. Also out of each appearing in train company categories, we put aside / removed one filter and queries related to it.
# How to use it
Embedding Studio team uses this filters to generate queries and theirs parsed version for Falcon-7B-Instruct fine-tuning to follow Zero-Shot search queries parsing instructions.
| [
"# Synthetic Search Filters Raw\n\nThis is the raw version of EmbeddingStudio/synthetic-search-filters dataset.\n\nThis is generated with GPT-4 Turbo possible search filters and theirs representations for the given business/service categories:",
"## Columns description\n\n* category (type: str) - name of a business / service.\n* filters (type: str) - JSON parsable filters schema\n\nFilters schema is JSON-readable line in the format (we highly recommend you to use it): \nList of filters (dict):\n\n* Name - name of filter (better to be meaningful).\n* Representations - list of possible filter formats (dict):\n * Name - name of representation (better to be meaningful).\n * Type - python base type (int, float, str, bool).\n * Examples - list of examples.\n * Enum - if a representation is enumeration, provide a list of possible values, LLM should map parsed value into this list.\n * Pattern - if a representation is pattern-like (datetime, regexp, etc.) provide a pattern text in any format.",
"## What are representations?\n\nIt's easier to understand with an exmaple. Imagine, you have a filter named 'Rating', so it can be represented as:\n* Integer or float value in 1-5 scale\n* Integer or float value in 1-10 scale\n* Integer or float value in 1-100 scale\n* As the enumeration with values (*, , *, , *)\n* As the enumeration with values (bad, medium, good, the best)",
"## Train / test splitting principles\n\nAs we are trying to fine-tune LLM to follow zero-shot query parsing instructions, so we want to test:\n\n* Ability to work well with unseen domain\n* Ability to work well with unseen filters\n* Ability to work well with unseen queries\n\nFor these purposes we:\n\n1. We put into test split 5 categories, completely separared from train: Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing.\n2. Also out of each appearing in train company categories, we put aside / removed one filter and queries related to it.",
"# How to use it\n\n\nEmbedding Studio team uses this filters to generate queries and theirs parsed version for Falcon-7B-Instruct fine-tuning to follow Zero-Shot search queries parsing instructions."
] | [
"TAGS\n#task_categories-token-classification #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-apache-2.0 #synthetic #search-queries #e-commerce #online-shops #travel-agencies #educational-institutions-ai #job-recruitment-automation #banking-digital-services #investment-ai-analysis #insurance-tech-innovation #financial-advisory-ai #credit-services-automation #payment-processing-tech #mortgage-tech-solutions #real-estate-digital-solutions #taxation-tech-services #risk-management-ai #compliance-automation #digital-banking-innovation #mobile-banking-tech #online-retail-tech #offline-retail-automation #automotive-dealership-tech #restaurant-automation-tech #food-delivery-ai #entertainment-platforms-ai #media-platforms-tech #government-services-automation #travel-tech-innovation #consumer-analytics-ai #logistics-tech-automation #supply-chain-ai #customer-support-tech #market-research-ai #mobile-app-dev-tech #game-dev-ai #cloud-computing-services #data-analytics-ai #business-intelligence-ai #cybersecurity-software-tech #ui-ux-design-ai #iot-development-tech #project-management-tools-ai #version-control-systems-tech #ci-cd-automation #issue-tracking-ai #bug-reporting-automation #collaborative-dev-environments #team-communication-tech #task-time-management-ai #customer-feedback-ai #cloud-based-dev-tech #image-stock-platforms-ai #video-hosting-tech #social-networks-ai #professional-social-networks-ai #dating-apps-tech #region-us \n",
"# Synthetic Search Filters Raw\n\nThis is the raw version of EmbeddingStudio/synthetic-search-filters dataset.\n\nThis is generated with GPT-4 Turbo possible search filters and theirs representations for the given business/service categories:",
"## Columns description\n\n* category (type: str) - name of a business / service.\n* filters (type: str) - JSON parsable filters schema\n\nFilters schema is JSON-readable line in the format (we highly recommend you to use it): \nList of filters (dict):\n\n* Name - name of filter (better to be meaningful).\n* Representations - list of possible filter formats (dict):\n * Name - name of representation (better to be meaningful).\n * Type - python base type (int, float, str, bool).\n * Examples - list of examples.\n * Enum - if a representation is enumeration, provide a list of possible values, LLM should map parsed value into this list.\n * Pattern - if a representation is pattern-like (datetime, regexp, etc.) provide a pattern text in any format.",
"## What are representations?\n\nIt's easier to understand with an exmaple. Imagine, you have a filter named 'Rating', so it can be represented as:\n* Integer or float value in 1-5 scale\n* Integer or float value in 1-10 scale\n* Integer or float value in 1-100 scale\n* As the enumeration with values (*, , *, , *)\n* As the enumeration with values (bad, medium, good, the best)",
"## Train / test splitting principles\n\nAs we are trying to fine-tune LLM to follow zero-shot query parsing instructions, so we want to test:\n\n* Ability to work well with unseen domain\n* Ability to work well with unseen filters\n* Ability to work well with unseen queries\n\nFor these purposes we:\n\n1. We put into test split 5 categories, completely separared from train: Telecommunication Companies, Legal Services, Enterprise Software Development, Artificial Intelligence and Machine Learning, Documentation and Knowledge Sharing.\n2. Also out of each appearing in train company categories, we put aside / removed one filter and queries related to it.",
"# How to use it\n\n\nEmbedding Studio team uses this filters to generate queries and theirs parsed version for Falcon-7B-Instruct fine-tuning to follow Zero-Shot search queries parsing instructions."
] |
a8314dbe8632be0ba663ef7917ffb9121dd2f399 |
# Dataset Card for Evaluation run of Weyaxi/Astralis-4x34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Astralis-4x34B](https://huggingface.co/Weyaxi/Astralis-4x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Astralis-4x34B",
"harness_winogrande_5",
split="train")
```
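If you want to see which task configurations and run timestamps exist before loading anything, the `datasets` library provides helpers for that. The snippet below is a sketch and assumes network access to the Hugging Face Hub:
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_Weyaxi__Astralis-4x34B"

# List every task configuration stored in this repo (e.g. "harness_winogrande_5").
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.:", configs[:5])

# For one configuration, list its splits: one per run timestamp plus "latest".
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```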
## Latest results
These are the [latest results from run 2024-01-16T12:46:01.846891](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Astralis-4x34B/blob/main/results_2024-01-16T12-46-01.846891.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7689655302402937,
"acc_stderr": 0.027836439930903466,
"acc_norm": 0.7725041323170285,
"acc_norm_stderr": 0.028368985132315505,
"mc1": 0.4638922888616891,
"mc1_stderr": 0.017457800422268622,
"mc2": 0.6354665689810353,
"mc2_stderr": 0.014745932385231659
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205763,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.6563433578968333,
"acc_stderr": 0.004739575380508865,
"acc_norm": 0.8517227643895638,
"acc_norm_stderr": 0.0035464830155691176
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9144736842105263,
"acc_stderr": 0.02275867713088861,
"acc_norm": 0.9144736842105263,
"acc_norm_stderr": 0.02275867713088861
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372277,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.701058201058201,
"acc_stderr": 0.023577604791655805,
"acc_norm": 0.701058201058201,
"acc_norm_stderr": 0.023577604791655805
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9064516129032258,
"acc_stderr": 0.016565754668270972,
"acc_norm": 0.9064516129032258,
"acc_norm_stderr": 0.016565754668270972
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.02602465765165619,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.02602465765165619
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.017646526677233335,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.017646526677233335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.019880165406588803,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.019880165406588803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03029677128606732,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03029677128606732
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8739495798319328,
"acc_stderr": 0.021559623121213928,
"acc_norm": 0.8739495798319328,
"acc_norm_stderr": 0.021559623121213928
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9119266055045872,
"acc_stderr": 0.012150743719481662,
"acc_norm": 0.9119266055045872,
"acc_norm_stderr": 0.012150743719481662
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6712962962962963,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.6712962962962963,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.01926932302564026,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.01926932302564026
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342344,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342344
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407388,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407388
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.02684576505455385,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.02684576505455385
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.625,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.9223300970873787,
"acc_stderr": 0.026501440784762752,
"acc_norm": 0.9223300970873787,
"acc_norm_stderr": 0.026501440784762752
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311357,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311357
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292849,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292849
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.019829299214925416,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.019829299214925416
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7787709497206704,
"acc_stderr": 0.013882164598887282,
"acc_norm": 0.7787709497206704,
"acc_norm_stderr": 0.013882164598887282
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.019704039183859816,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.019704039183859816
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8327974276527331,
"acc_stderr": 0.02119387252803497,
"acc_norm": 0.8327974276527331,
"acc_norm_stderr": 0.02119387252803497
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.01810541409432967,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.01810541409432967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6737588652482269,
"acc_stderr": 0.02796845304356316,
"acc_norm": 0.6737588652482269,
"acc_norm_stderr": 0.02796845304356316
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6127770534550195,
"acc_stderr": 0.012441155326854931,
"acc_norm": 0.6127770534550195,
"acc_norm_stderr": 0.012441155326854931
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.022571771025494757,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.022571771025494757
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.023420972069166344,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.023420972069166344
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824657,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824657
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594176,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594176
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4638922888616891,
"mc1_stderr": 0.017457800422268622,
"mc2": 0.6354665689810353,
"mc2_stderr": 0.014745932385231659
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.010267936243028217
},
"harness|gsm8k|5": {
"acc": 0.7164518574677786,
"acc_stderr": 0.01241507091750812
}
}
```
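As a convenience, the per-task metrics above can be ranked with a few lines of Python. The snippet below copies a handful of entries from the JSON above into a dict; substitute the full results dict if you load the linked file yourself:
```python
# A few entries copied from the results JSON above; substitute the full dict if needed.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.697098976109215},
    "harness|hellaswag|10": {"acc_norm": 0.8517227643895638},
    "harness|winogrande|5": {"acc": 0.8413575374901342},
    "harness|gsm8k|5": {"acc": 0.7164518574677786},
}

# Prefer normalized accuracy where it exists, otherwise fall back to plain accuracy.
per_task = {task: m.get("acc_norm", m.get("acc")) for task, m in results.items()}

for task, score in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:.3f}  {task}")
```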
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Astralis-4x34B | [
"region:us"
] | 2024-01-16T12:48:15+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Astralis-4x34B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Astralis-4x34B](https://huggingface.co/Weyaxi/Astralis-4x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Astralis-4x34B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T12:46:01.846891](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Astralis-4x34B/blob/main/results_2024-01-16T12-46-01.846891.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7689655302402937,\n \"acc_stderr\": 0.027836439930903466,\n \"acc_norm\": 0.7725041323170285,\n \"acc_norm_stderr\": 0.028368985132315505,\n \"mc1\": 0.4638922888616891,\n \"mc1_stderr\": 0.017457800422268622,\n \"mc2\": 0.6354665689810353,\n \"mc2_stderr\": 0.014745932385231659\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205763,\n \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185349\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6563433578968333,\n \"acc_stderr\": 0.004739575380508865,\n \"acc_norm\": 0.8517227643895638,\n \"acc_norm_stderr\": 0.0035464830155691176\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9144736842105263,\n \"acc_stderr\": 0.02275867713088861,\n \"acc_norm\": 0.9144736842105263,\n \"acc_norm_stderr\": 0.02275867713088861\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372277,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372277\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n 
\"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349417,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.034165204477475494,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.034165204477475494\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.701058201058201,\n \"acc_stderr\": 0.023577604791655805,\n \"acc_norm\": 0.701058201058201,\n \"acc_norm_stderr\": 0.023577604791655805\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9064516129032258,\n \"acc_stderr\": 0.016565754668270972,\n \"acc_norm\": 0.9064516129032258,\n \"acc_norm_stderr\": 0.016565754668270972\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.02602465765165619,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.02602465765165619\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.017646526677233335,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.017646526677233335\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8102564102564103,\n \"acc_stderr\": 
0.019880165406588803,\n \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.019880165406588803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03029677128606732,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03029677128606732\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8739495798319328,\n \"acc_stderr\": 0.021559623121213928,\n \"acc_norm\": 0.8739495798319328,\n \"acc_norm_stderr\": 0.021559623121213928\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9119266055045872,\n \"acc_stderr\": 0.012150743719481662,\n \"acc_norm\": 0.9119266055045872,\n \"acc_norm_stderr\": 0.012150743719481662\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6712962962962963,\n \"acc_stderr\": 0.032036140846700596,\n \"acc_norm\": 0.6712962962962963,\n \"acc_norm_stderr\": 0.032036140846700596\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.01926932302564026,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.01926932302564026\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342344,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342344\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.02684576505455385,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.02684576505455385\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9223300970873787,\n \"acc_stderr\": 0.026501440784762752,\n \"acc_norm\": 0.9223300970873787,\n \"acc_norm_stderr\": 0.026501440784762752\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.016534627684311357,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.016534627684311357\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9067688378033205,\n \"acc_stderr\": 0.010397417087292849,\n \"acc_norm\": 0.9067688378033205,\n \"acc_norm_stderr\": 
0.010397417087292849\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.838150289017341,\n \"acc_stderr\": 0.019829299214925416,\n \"acc_norm\": 0.838150289017341,\n \"acc_norm_stderr\": 0.019829299214925416\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7787709497206704,\n \"acc_stderr\": 0.013882164598887282,\n \"acc_norm\": 0.7787709497206704,\n \"acc_norm_stderr\": 0.013882164598887282\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.019704039183859816,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.019704039183859816\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8327974276527331,\n \"acc_stderr\": 0.02119387252803497,\n \"acc_norm\": 0.8327974276527331,\n \"acc_norm_stderr\": 0.02119387252803497\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.01810541409432967,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.01810541409432967\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6737588652482269,\n \"acc_stderr\": 0.02796845304356316,\n \"acc_norm\": 0.6737588652482269,\n \"acc_norm_stderr\": 0.02796845304356316\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6127770534550195,\n \"acc_stderr\": 0.012441155326854931,\n \"acc_norm\": 0.6127770534550195,\n \"acc_norm_stderr\": 0.012441155326854931\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.022571771025494757,\n \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.022571771025494757\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.023420972069166344,\n \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.023420972069166344\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824657,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594176,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594176\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4638922888616891,\n \"mc1_stderr\": 0.017457800422268622,\n \"mc2\": 0.6354665689810353,\n \"mc2_stderr\": 0.014745932385231659\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028217\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7164518574677786,\n \"acc_stderr\": 0.01241507091750812\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/Astralis-4x34B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|arc:challenge|25_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|gsm8k|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hellaswag|10_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-46-01.846891.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-46-01.846891.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-46-01.846891.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T12-46-01.846891.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-46-01.846891.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-46-01.846891.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["**/details_harness|winogrande|5_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T12-46-01.846891.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T12_46_01.846891", "path": ["results_2024-01-16T12-46-01.846891.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T12-46-01.846891.parquet"]}]}]} | 2024-01-16T12:48:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Astralis-4x34B
Dataset automatically created during the evaluation run of model Weyaxi/Astralis-4x34B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
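A minimal sketch of such a load call, assuming the details repository follows the usual leaderboard naming pattern (`open-llm-leaderboard/details_<org>__<model>`) and that the `harness_winogrande_5` configuration used below exists for this run:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard naming convention for Weyaxi/Astralis-4x34B
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Astralis-4x34B",
	"harness_winogrande_5",
	split="train")
```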
## Latest results
These are the latest results from run 2024-01-16T12:46:01.846891 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Astralis-4x34B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Astralis-4x34B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T12:46:01.846891(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Astralis-4x34B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Astralis-4x34B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T12:46:01.846891(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b63d9462e42b433ca1f0b9a35f6f79582a71c0de | # Dataset Card for "myridade_dbg_aligned_ontologie_filter_myriade_int_label"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/myridade_dbg_aligned_ontologie_filter_myriade_int_label | [
"region:us"
] | 2024-01-16T12:51:09+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 61550504, "num_examples": 98206}], "download_size": 13126741, "dataset_size": 61550504}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T13:58:11+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "myridade_dbg_aligned_ontologie_filter_myriade_int_label"
More Information needed | [
"# Dataset Card for \"myridade_dbg_aligned_ontologie_filter_myriade_int_label\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"myridade_dbg_aligned_ontologie_filter_myriade_int_label\"\n\nMore Information needed"
] |
68dce8c35400f3a8ae79622fad12524829e3006b | # Dataset Card for "ner_vir_naeus_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mikrz/ner_vir_naeus_dataset | [
"region:us"
] | 2024-01-16T12:55:27+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "int32"}, {"name": "tokens", "sequence": "string"}, {"name": "ner_tags", "sequence": {"class_label": {"names": {"0": "B-BAC", "1": "I-BAC", "2": "B-VIR", "3": "I-VIR", "4": "O"}}}}], "splits": [{"name": "train", "num_bytes": 90354434, "num_examples": 23589}, {"name": "test", "num_bytes": 28940230, "num_examples": 7583}, {"name": "valid", "num_bytes": 9749348, "num_examples": 2527}], "download_size": 19649467, "dataset_size": 129044012}} | 2024-01-16T12:55:35+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ner_vir_naeus_dataset"
More Information needed | [
"# Dataset Card for \"ner_vir_naeus_dataset\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ner_vir_naeus_dataset\"\n\nMore Information needed"
] |
530b0b6ccd99a03c559aed30c9d94105faf5fc4b |
# Dataset Card for Evaluation run of gagan3012/Multilingual-mistral
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gagan3012/Multilingual-mistral](https://huggingface.co/gagan3012/Multilingual-mistral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gagan3012__Multilingual-mistral",
"harness_winogrande_5",
split="train")
```
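If you only need the aggregated scores rather than the per-task details, the "results" configuration mentioned above can be loaded the same way; a hedged variant (the "latest" split name is assumed from the split layout used by these leaderboard detail repos):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" is assumed to point at the most recent run
results = load_dataset("open-llm-leaderboard/details_gagan3012__Multilingual-mistral",
	"results",
	split="latest")
```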
## Latest results
These are the [latest results from run 2024-01-16T13:00:17.256624](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multilingual-mistral/blob/main/results_2024-01-16T13-00-17.256624.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6125296970145792,
"acc_stderr": 0.032987213837153834,
"acc_norm": 0.6173816622782963,
"acc_norm_stderr": 0.03365203917647103,
"mc1": 0.40024479804161567,
"mc1_stderr": 0.017151605555749138,
"mc2": 0.555285737063638,
"mc2_stderr": 0.015637031939929425
},
"harness|arc:challenge|25": {
"acc": 0.5938566552901023,
"acc_stderr": 0.014351656690097862,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192593
},
"harness|hellaswag|10": {
"acc": 0.6274646484763992,
"acc_stderr": 0.004824917516374184,
"acc_norm": 0.8175662218681538,
"acc_norm_stderr": 0.00385412337350911
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949646,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949646
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067877,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067877
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.02797605491534735,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.02797605491534735
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630644,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848023,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848023
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.038448761397852714,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.038448761397852714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7905491698595147,
"acc_stderr": 0.014551310568143705,
"acc_norm": 0.7905491698595147,
"acc_norm_stderr": 0.014551310568143705
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.02513100023364789,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.02513100023364789
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33631284916201115,
"acc_stderr": 0.01580100372914589,
"acc_norm": 0.33631284916201115,
"acc_norm_stderr": 0.01580100372914589
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4406779661016949,
"acc_stderr": 0.012680037994097079,
"acc_norm": 0.4406779661016949,
"acc_norm_stderr": 0.012680037994097079
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40024479804161567,
"mc1_stderr": 0.017151605555749138,
"mc2": 0.555285737063638,
"mc2_stderr": 0.015637031939929425
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
},
"harness|gsm8k|5": {
"acc": 0.4025777103866566,
"acc_stderr": 0.013508523063663439
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_gagan3012__Multilingual-mistral | [
"region:us"
] | 2024-01-16T13:02:35+00:00 | {"pretty_name": "Evaluation run of gagan3012/Multilingual-mistral", "dataset_summary": "Dataset automatically created during the evaluation run of model [gagan3012/Multilingual-mistral](https://huggingface.co/gagan3012/Multilingual-mistral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gagan3012__Multilingual-mistral\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T13:00:17.256624](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multilingual-mistral/blob/main/results_2024-01-16T13-00-17.256624.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6125296970145792,\n \"acc_stderr\": 0.032987213837153834,\n \"acc_norm\": 0.6173816622782963,\n \"acc_norm_stderr\": 0.03365203917647103,\n \"mc1\": 0.40024479804161567,\n \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.555285737063638,\n \"mc2_stderr\": 0.015637031939929425\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5938566552901023,\n \"acc_stderr\": 0.014351656690097862,\n \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192593\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6274646484763992,\n \"acc_stderr\": 0.004824917516374184,\n \"acc_norm\": 0.8175662218681538,\n \"acc_norm_stderr\": 0.00385412337350911\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n 
\"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949646,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949646\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5903225806451613,\n \"acc_stderr\": 0.02797605491534735,\n \"acc_norm\": 0.5903225806451613,\n \"acc_norm_stderr\": 0.02797605491534735\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.031429466378837076,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.031429466378837076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848023,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848023\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.029102254389674082,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.029102254389674082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.038448761397852714,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.038448761397852714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n \"acc_stderr\": 0.014551310568143705,\n \"acc_norm\": 
0.7905491698595147,\n \"acc_norm_stderr\": 0.014551310568143705\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.02513100023364789,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.02513100023364789\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n \"acc_stderr\": 0.01580100372914589,\n \"acc_norm\": 0.33631284916201115,\n \"acc_norm_stderr\": 0.01580100372914589\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n \"acc_stderr\": 0.012680037994097079,\n \"acc_norm\": 0.4406779661016949,\n \"acc_norm_stderr\": 0.012680037994097079\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.555285737063638,\n \"mc2_stderr\": 0.015637031939929425\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4025777103866566,\n \"acc_stderr\": 0.013508523063663439\n }\n}\n```", "repo_url": "https://huggingface.co/gagan3012/Multilingual-mistral", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|arc:challenge|25_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|gsm8k|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hellaswag|10_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T13-00-17.256624.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T13-00-17.256624.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T13-00-17.256624.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T13-00-17.256624.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T13-00-17.256624.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T13-00-17.256624.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["**/details_harness|winogrande|5_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T13-00-17.256624.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T13_00_17.256624", "path": ["results_2024-01-16T13-00-17.256624.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T13-00-17.256624.parquet"]}]}]} | 2024-01-16T13:02:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of gagan3012/Multilingual-mistral
Dataset automatically created during the evaluation run of model gagan3012/Multilingual-mistral on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-16T13:00:17.256624 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of gagan3012/Multilingual-mistral\n\n\n\nDataset automatically created during the evaluation run of model gagan3012/Multilingual-mistral on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T13:00:17.256624(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of gagan3012/Multilingual-mistral\n\n\n\nDataset automatically created during the evaluation run of model gagan3012/Multilingual-mistral on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T13:00:17.256624(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
92ca309f2e823b951bb23fa8a2c61d198b5f1622 |
This is the dataset corresponding to the paper's experiments, used to reproduce the accuracy mentioned in the paper.
[POPE: 6-DoF Promptable Pose Estimation of Any Object, in Any Scene, with One Reference](https://arxiv.org/abs/2305.15727)
Please download and unzip the dataset into './data'.
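A minimal sketch of one way to fetch the files with `huggingface_hub` is shown below; the extraction step is an assumption, since the archive names inside the repository are not listed here.

```python
from huggingface_hub import snapshot_download
import glob
import zipfile

# Download the full dataset repository into ./data
snapshot_download(
    repo_id="paulpanwang/POPE_Dataset",
    repo_type="dataset",
    local_dir="./data",
)

# If the repository ships zip archives, extract them in place
# (the *.zip pattern is an assumption about how the files are packaged).
for archive in glob.glob("./data/*.zip"):
    with zipfile.ZipFile(archive) as zf:
        zf.extractall("./data")
```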
| paulpanwang/POPE_Dataset | [
"license:mit",
"arxiv:2305.15727",
"region:us"
] | 2024-01-16T13:07:21+00:00 | {"license": "mit"} | 2024-01-16T14:15:02+00:00 | [
"2305.15727"
] | [] | TAGS
#license-mit #arxiv-2305.15727 #region-us
|
This is the dataset corresponding to the paper's experiments, used to reproduce the accuracy mentioned in the paper.
POPE: 6-DoF Promptable Pose Estimation of Any Object, in Any Scene, with One Reference
Please download and unzip the dataset into './data'.
| [] | [
"TAGS\n#license-mit #arxiv-2305.15727 #region-us \n"
] |
24dc5978096d550739c6d7fc4f2b381423e1e070 |
# Company Reports Dataset
## Description
This dataset contains ESG (Environmental, Social, and Governance) sustainability reports from various companies. It includes data like company details, report categories, textual analysis of the reports, and more.
## Dataset Structure
- `id`: Unique identifier for each report entry.
- `document_category`: Classification of the document (e.g., ESG sustainability report).
- `year`: Publication year of the report.
- `company_name`: Name of the respective company.
- `company_description`: A concise description of the company.
- `company_website`: The official website URL of the company.
- `economic_activity`: Sector of economic activity.
- `file_name`: Filename of the report.
- `url`: Direct URL to access the report.
- `downloaded_at`: Date and time when the report was downloaded.
- `text_analyzed`: The analyzed text extracted from the report.
- `tot_text_cleaned`: The cleaned version of the report text.
- `tot_text_raw`: The original, unedited text of the report.
- `documents_description`: A short description of the documents.
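As a rough, unofficial illustration, the fields above can be inspected after loading one of the dated splits with `datasets`; the split name below is taken from this repository's configuration, and newer versions may be added over time.

```python
from datasets import load_dataset

# Load one dated processing version of the reports
# (split names correspond to the versions described in the next section).
reports = load_dataset("DataNeed/company-reports", split="2024_01_22")

example = reports[0]
print(example["company_name"], example["year"], example["document_category"])
print(example["text_analyzed"][:500])  # first 500 characters of the analysed text
```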
## Data Processing Versions
Different versions of the dataset are available, each processed with specific methodologies:
### Version: 2024_01_19
- **Extraction Methodology**: Utilizing 'unstructured' package with the following parameters:
- Partition method with strategy = fast
- **Translation Methodology**:
- Individual element_id translation using the multilingual model 'facebook/m2m100_418M'
- **Feature Extraction Methodology**:
- Concatenation of cleaned and translated element_ids, with GPT-3.5-turbo model applied to the first 2500 characters for JSON extraction.
### Versions: 2024_01_21, 2024_01_22
- **Extraction Methodology**: Employing 'unstructured' package with the following parameters:
- Partition_pdf method with strategy = auto
- **Translation Methodology**:
- Aggregating into chunks, translating each chunk using Helsinki-NLP/opus-mt-{source_lang}-{target_lang} models. Language detection on each chunk is performed using langdetect.detect(text_cleaned[:100]).
- **Feature Extraction Methodology**:
- Concatenation of cleaned and translated chunks, with GPT-3.5-turbo model applied to the first 2500 characters for JSON extraction.
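A rough sketch of the chunk-translation step described above follows; only the language-detection call and the Helsinki-NLP model naming scheme come from the description, while the function shape and the omission of error handling are assumptions.

```python
from langdetect import detect
from transformers import pipeline

def translate_chunk(text_cleaned: str, target_lang: str = "en") -> str:
    # Detect the source language on the first 100 characters, as described above.
    source_lang = detect(text_cleaned[:100])
    if source_lang == target_lang:
        return text_cleaned
    # Pick the matching Helsinki-NLP/opus-mt model for this language pair
    # (not every pair has a published model; that case is not handled here).
    translator = pipeline(
        "translation",
        model=f"Helsinki-NLP/opus-mt-{source_lang}-{target_lang}",
    )
    return translator(text_cleaned)[0]["translation_text"]
```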
### Version: 2024_01_23
- [Details to be updated]
## Collection Methodology
The dataset was collected from publicly available ESG reports of various companies to represent a wide range of industries.
## Intended Use
This dataset is suitable for tasks like text analysis, ESG metric analysis, corporate sustainability practice research, and more.
## Useful link
https://huggingface.co/docs/datasets/share
## Licensing
The dataset is available under the [CC-BY-SA-4.0](https://creativecommons.org/licenses/by-sa/4.0/) license. Please ensure to adhere to the terms of this license when using or distributing this dataset.
| DataNeed/company-reports | [
"task_categories:text-classification",
"language:en",
"license:cc-by-sa-4.0",
"esg reports",
"sustainability",
"corporate governance",
"environmental",
"region:us"
] | 2024-01-16T13:35:55+00:00 | {"language": ["en"], "license": "cc-by-sa-4.0", "task_categories": ["text-classification"], "pretty_name": "Company Reports Dataset", "tags": ["esg reports", "sustainability", "corporate governance", "environmental"], "configs": [{"config_name": "default", "data_files": [{"split": "2024_02_03", "path": ["data/company_reports_2024_02_03.json"]}, {"split": "2024_01_23", "path": ["data/company_reports_2024_01_23.json"]}, {"split": "2024_01_22", "path": ["data/company_reports_2024_01_22.json"]}, {"split": "2024_01_21", "path": ["data/company_reports_2024_01_21.json"]}, {"split": "2024_01_19", "path": ["data/company_reports_2024_01_19.json"]}]}]} | 2024-02-07T10:15:51+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #language-English #license-cc-by-sa-4.0 #esg reports #sustainability #corporate governance #environmental #region-us
|
# Company Reports Dataset
## Description
This dataset contains ESG (Environmental, Social, and Governance) sustainability reports from various companies. It includes data like company details, report categories, textual analysis of the reports, and more.
## Dataset Structure
- 'id': Unique identifier for each report entry.
- 'document_category': Classification of the document (e.g., ESG sustainability report).
- 'year': Publication year of the report.
- 'company_name': Name of the respective company.
- 'company_description': A concise description of the company.
- 'company_website': The official website URL of the company.
- 'economic_activity': Sector of economic activity.
- 'file_name': Filename of the report.
- 'url': Direct URL to access the report.
- 'downloaded_at': Date and time when the report was downloaded.
- 'text_analyzed': The analyzed text extracted from the report.
- 'tot_text_cleaned': The cleaned version of the report text.
- 'tot_text_raw': The original, unedited text of the report.
- 'documents_description': A short description of the documents.
## Data Processing Versions
Different versions of the dataset are available, each processed with specific methodologies:
### Version: 2024_01_19
- Extraction Methodology: Utilizing 'unstructured' package with the following parameters:
- Partition method with strategy = fast
- Translation Methodology:
- Individual element_id translation using the multilingual model 'facebook/m2m100_418M'
- Feature Extraction Methodology:
- Concatenation of cleaned and translated element_ids, with GPT-3.5-turbo model applied to the first 2500 characters for JSON extraction.
### Versions: 2024_01_21, 2024_01_22
- Extraction Methodology: Employing 'unstructured' package with the following parameters:
- Partition_pdf method with strategy = auto
- Translation Methodology:
- Aggregating into chunks, translating each chunk using Helsinki-NLP/opus-mt-{source_lang}-{target_lang} models. Language detection on each chunk is performed using URL(text_cleaned[:100]).
- Feature Extraction Methodology:
- Concatenation of cleaned and translated chunks, with GPT-3.5-turbo model applied to the first 2500 characters for JSON extraction.
### Version: 2024_01_23
- [Details to be updated]
## Collection Methodology
The dataset was collected from publicly available ESG reports of various companies to represent a wide range of industries.
## Intended Use
This dataset is suitable for tasks like text analysis, ESG metric analysis, corporate sustainability practice research, and more.
## Useful link
URL
## Licensing
The dataset is available under the CC-BY-SA-4.0 license. Please ensure to adhere to the terms of this license when using or distributing this dataset.
| [
"# Company Reports Dataset",
"## Description\n\nThis dataset contains ESG (Environmental, Social, and Governance) sustainability reports from various companies. It includes data like company details, report categories, textual analysis of the reports, and more.",
"## Dataset Structure\n\n- 'id': Unique identifier for each report entry.\n- 'document_category': Classification of the document (e.g., ESG sustainability report).\n- 'year': Publication year of the report.\n- 'company_name': Name of the respective company.\n- 'company_description': A concise description of the company.\n- 'company_website': The official website URL of the company.\n- 'economic_activity': Sector of economic activity.\n- 'file_name': Filename of the report.\n- 'url': Direct URL to access the report.\n- 'downloaded_at': Date and time when the report was downloaded.\n- 'text_analyzed': The analyzed text extracted from the report.\n- 'tot_text_cleaned': The cleaned version of the report text.\n- 'tot_text_raw': The original, unedited text of the report.\n- 'documents_description': A short description of the documents.",
"## Data Processing Versions\n\nDifferent versions of the dataset are available, each processed with specific methodologies:",
"### Version: 2024_01_19\n\n- Extraction Methodology: Utilizing 'unstructured' package with the following parameters:\n - Partition method with strategy = fast\n- Translation Methodology:\n - Individual element_id translation using the multilingual model 'facebook/m2m100_418M'\n- Feature Extraction Methodology:\n - Concatenation of cleaned and translated element_ids, with GPT-3.5-turbo model applied to the first 2500 characters for JSON extraction.",
"### Versions: 2024_01_21, 2024_01_22\n\n- Extraction Methodology: Employing 'unstructured' package with the following parameters:\n - Partition_pdf method with strategy = auto\n- Translation Methodology:\n - Aggregating into chunks, translating each chunk using Helsinki-NLP/opus-mt-{source_lang}-{target_lang} models. Language detection on each chunk is performed using URL(text_cleaned[:100]).\n- Feature Extraction Methodology:\n - Concatenation of cleaned and translated chunks, with GPT-3.5-turbo model applied to the first 2500 characters for JSON extraction.",
"### Version: 2024_01_23\n\n- [Details to be updated]",
"## Collection Methodology\n\nThe dataset was collected from publicly available ESG reports of various companies to represent a wide range of industries.",
"## Intended Use\n\nThis dataset is suitable for tasks like text analysis, ESG metric analysis, corporate sustainability practice research, and more.",
"## Useful link\n\nURL",
"## Licensing\n\nThe dataset is available under the CC-BY-SA-4.0 license. Please ensure to adhere to the terms of this license when using or distributing this dataset."
] | [
"TAGS\n#task_categories-text-classification #language-English #license-cc-by-sa-4.0 #esg reports #sustainability #corporate governance #environmental #region-us \n",
"# Company Reports Dataset",
"## Description\n\nThis dataset contains ESG (Environmental, Social, and Governance) sustainability reports from various companies. It includes data like company details, report categories, textual analysis of the reports, and more.",
"## Dataset Structure\n\n- 'id': Unique identifier for each report entry.\n- 'document_category': Classification of the document (e.g., ESG sustainability report).\n- 'year': Publication year of the report.\n- 'company_name': Name of the respective company.\n- 'company_description': A concise description of the company.\n- 'company_website': The official website URL of the company.\n- 'economic_activity': Sector of economic activity.\n- 'file_name': Filename of the report.\n- 'url': Direct URL to access the report.\n- 'downloaded_at': Date and time when the report was downloaded.\n- 'text_analyzed': The analyzed text extracted from the report.\n- 'tot_text_cleaned': The cleaned version of the report text.\n- 'tot_text_raw': The original, unedited text of the report.\n- 'documents_description': A short description of the documents.",
"## Data Processing Versions\n\nDifferent versions of the dataset are available, each processed with specific methodologies:",
"### Version: 2024_01_19\n\n- Extraction Methodology: Utilizing 'unstructured' package with the following parameters:\n - Partition method with strategy = fast\n- Translation Methodology:\n - Individual element_id translation using the multilingual model 'facebook/m2m100_418M'\n- Feature Extraction Methodology:\n - Concatenation of cleaned and translated element_ids, with GPT-3.5-turbo model applied to the first 2500 characters for JSON extraction.",
"### Versions: 2024_01_21, 2024_01_22\n\n- Extraction Methodology: Employing 'unstructured' package with the following parameters:\n - Partition_pdf method with strategy = auto\n- Translation Methodology:\n - Aggregating into chunks, translating each chunk using Helsinki-NLP/opus-mt-{source_lang}-{target_lang} models. Language detection on each chunk is performed using URL(text_cleaned[:100]).\n- Feature Extraction Methodology:\n - Concatenation of cleaned and translated chunks, with GPT-3.5-turbo model applied to the first 2500 characters for JSON extraction.",
"### Version: 2024_01_23\n\n- [Details to be updated]",
"## Collection Methodology\n\nThe dataset was collected from publicly available ESG reports of various companies to represent a wide range of industries.",
"## Intended Use\n\nThis dataset is suitable for tasks like text analysis, ESG metric analysis, corporate sustainability practice research, and more.",
"## Useful link\n\nURL",
"## Licensing\n\nThe dataset is available under the CC-BY-SA-4.0 license. Please ensure to adhere to the terms of this license when using or distributing this dataset."
] |
e6f8441f19368ab557835f261f8791e9fc9376ab |
Fork of [jihyoung/ConversationChronicles](https://huggingface.co/datasets/jihyoung/ConversationChronicles?row=0) | aloobun/bun_multi_convo | [
"task_categories:conversational",
"language:en",
"license:cc-by-4.0",
"region:us"
] | 2024-01-16T13:44:39+00:00 | {"language": ["en"], "license": "cc-by-4.0", "task_categories": ["conversational"]} | 2024-01-16T17:18:12+00:00 | [] | [
"en"
] | TAGS
#task_categories-conversational #language-English #license-cc-by-4.0 #region-us
|
Fork of jihyoung/ConversationChronicles | [] | [
"TAGS\n#task_categories-conversational #language-English #license-cc-by-4.0 #region-us \n"
] |
e41f3ecf33886c3dc57a81fa938b2d8ad2c59a53 | # 🇻🇳 Vietnamese Self-Chat Dataset
This dataset is designed to enhance the model's ability to engage in multi-turn conversations with humans.
To construct this dataset, we follow a two-step process:
- Step 1: Instruction Generation
We employ the methodology outlined in the [Self-Instruct paper](https://arxiv.org/abs/2212.10560) to craft a diverse set of instructions. This paper serves as a guide for aligning pretrained language models with specific instructions, providing a structured foundation for subsequent dialogue generation.
- Step 2: Synthetic Self-Chat Conversations
Building upon the instructions generated in the first step, we draw inspiration from the [Baize paper](https://arxiv.org/abs/2304.01196). The goal is to simulate synthetic multi-turn interactions that the model can learn from.
By combining these two steps, we aim to create a robust and versatile dataset that empowers the model to navigate and contribute effectively in complex conversational scenarios. This dataset serves as a valuable resource for refining the model's language understanding and response generation capabilities in the context of human-like dialogue.
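A minimal loading sketch is shown below; the `conversations` field with `from`/`value` keys follows the dataset's declared features (the usual ShareGPT layout).

```python
from datasets import load_dataset

dataset = load_dataset("bkai-foundation-models/vi-self-chat-sharegpt-format", split="train")

# Each row holds one synthetic multi-turn conversation in ShareGPT format.
for turn in dataset[0]["conversations"]:
    print(f"{turn['from']}: {turn['value'][:80]}")
```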
| bkai-foundation-models/vi-self-chat-sharegpt-format | [
"arxiv:2212.10560",
"arxiv:2304.01196",
"region:us"
] | 2024-01-16T14:04:27+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 77553076, "num_examples": 30399}], "download_size": 32137459, "dataset_size": 77553076}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T07:35:31+00:00 | [
"2212.10560",
"2304.01196"
] | [] | TAGS
#arxiv-2212.10560 #arxiv-2304.01196 #region-us
| # 🇻🇳 Vietnamese Self-Chat Dataset
This dataset is designed to enhance the model's ability to engage in multi-turn conversations with humans.
To construct this dataset, we follow a two-step process:
- Step 1: Instruction Generation
We employ the methodology outlined in the Self-Instruct paper to craft a diverse set of instructions. This paper serves as a guide for aligning pretrained language models with specific instructions, providing a structured foundation for subsequent dialogue generation.
- Step 2: Synthetic Self-Chat Conversations
Building upon the instructions generated in the first step, we draw inspiration from the Baize paper. The goal is to simulate synthetic multi-turn interactions that the model can learn from.
By combining these two steps, we aim to create a robust and versatile dataset that empowers the model to navigate and contribute effectively in complex conversational scenarios. This dataset serves as a valuable resource for refining the model's language understanding and response generation capabilities in the context of human-like dialogue.
| [
"# 🇻🇳 Vietnamese Self-Chat Dataset\n\nThis dataset is designed to enhance the model's ability to engage in multi-turn conversations with humans.\n\nTo construct this dataset, we follow a two-step process:\n\n- Step 1: Instruction Generation\nWe employ the methodology outlined in the Self-Instruct paper to craft a diverse set of instructions. This paper serves as a guide for aligning pretrained language models with specific instructions, providing a structured foundation for subsequent dialogue generation.\n\n- Step 2: Synthetic Self-Chat Conversations\nBuilding upon the instructions generated in the first step, we draw inspiration from the Baize paper. The goal is to simulate synthetic multi-turn interactions that the model can learn from.\n\nBy combining these two steps, we aim to create a robust and versatile dataset that empowers the model to navigate and contribute effectively in complex conversational scenarios. This dataset serves as a valuable resource for refining the model's language understanding and response generation capabilities in the context of human-like dialogue."
] | [
"TAGS\n#arxiv-2212.10560 #arxiv-2304.01196 #region-us \n",
"# 🇻🇳 Vietnamese Self-Chat Dataset\n\nThis dataset is designed to enhance the model's ability to engage in multi-turn conversations with humans.\n\nTo construct this dataset, we follow a two-step process:\n\n- Step 1: Instruction Generation\nWe employ the methodology outlined in the Self-Instruct paper to craft a diverse set of instructions. This paper serves as a guide for aligning pretrained language models with specific instructions, providing a structured foundation for subsequent dialogue generation.\n\n- Step 2: Synthetic Self-Chat Conversations\nBuilding upon the instructions generated in the first step, we draw inspiration from the Baize paper. The goal is to simulate synthetic multi-turn interactions that the model can learn from.\n\nBy combining these two steps, we aim to create a robust and versatile dataset that empowers the model to navigate and contribute effectively in complex conversational scenarios. This dataset serves as a valuable resource for refining the model's language understanding and response generation capabilities in the context of human-like dialogue."
] |
ee0e992c458381e7a70580c0bfa434b7e302f419 |
This is a subset (1000 samples) of the [`databricks/databricks-dolly-15k`](https://huggingface.co/datasets/databricks/databricks-dolly-15k) dataset, processed to match Mistral-7B-Instruct-v0.2's prompt format. It was created using this [Colab notebook](https://colab.research.google.com/drive/1sRy-FT4nqOKG9_K6i4txgDKYkOAdCfBQ?usp=sharing).
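The exact preprocessing lives in the linked notebook; a minimal sketch of what such a conversion typically looks like (assuming Mistral's `[INST] ... [/INST]` instruction format and the column names shown in the dataset features; the sampling seed and selection method here are invented, not taken from the notebook):

```python
from datasets import load_dataset

def to_mistral_prompt(example: dict) -> dict:
    # Fold the optional context into the instruction, then wrap everything in
    # Mistral's instruction tags; the target response follows the [/INST] tag.
    instruction = example["instruction"]
    if example.get("context"):
        instruction = f"{instruction}\n\n{example['context']}"
    example["text"] = f"<s>[INST] {instruction} [/INST] {example['response']}</s>"
    return example

dolly = load_dataset("databricks/databricks-dolly-15k", split="train")
subset = dolly.shuffle(seed=42).select(range(1000)).map(to_mistral_prompt)
```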
| wenqiglantz/databricks-dolly-1k | [
"region:us"
] | 2024-01-16T14:09:08+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1338157, "num_examples": 1000}], "download_size": 842842, "dataset_size": 1338157}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T14:12:00+00:00 | [] | [] | TAGS
#region-us
|
This is a subset (1000 samples) of 'databricks/databricks-dolly-15k' dataset, processed to match Mistral-7B-instruct-v0.2's prompt format. It was created using the colab notebook.
| [] | [
"TAGS\n#region-us \n"
] |
306409d45eadeb0416737e24f5349f12c729cd82 | This dataset contains images used in the Open-source AI cookbook: https://github.com/huggingface/cookbook.
Please make sure you optimize the assets before uploading them (e.g. using https://tinypng.com/). | huggingface/cookbook-images | [
"region:us"
] | 2024-01-16T14:29:31+00:00 | {} | 2024-01-26T15:44:47+00:00 | [] | [] | TAGS
#region-us
| This dataset contains images used in the Open-source AI cookbook: URL
Please make sure you optimize the assets before uploading them (e.g. using URL | [] | [
"TAGS\n#region-us \n"
] |
b3df72df5a7ddaf3e11ef723dce598cb32adb9f5 |
# Dataset Card for Evaluation run of Cartinoe5930/SOLAR-DUS-implement
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Cartinoe5930/SOLAR-DUS-implement](https://huggingface.co/Cartinoe5930/SOLAR-DUS-implement) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T14:37:28.066845](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement/blob/main/results_2024-01-16T14-37-28.066845.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6312296500454484,
"acc_stderr": 0.0323614114970197,
"acc_norm": 0.6390797710653894,
"acc_norm_stderr": 0.033030038319899674,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082683,
"mc2": 0.4071642776487792,
"mc2_stderr": 0.01422601728098354
},
"harness|arc:challenge|25": {
"acc": 0.5597269624573379,
"acc_stderr": 0.014506769524804241,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.6122286397132045,
"acc_stderr": 0.004862461799370392,
"acc_norm": 0.811790479984067,
"acc_norm_stderr": 0.003900805416736719
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368881,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368881
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.39,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097113,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097113
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739154,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.03338473403207401,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.03338473403207401
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001503,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001503
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.02500931379006971,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.02500931379006971
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31843575418994413,
"acc_stderr": 0.015581008080360276,
"acc_norm": 0.31843575418994413,
"acc_norm_stderr": 0.015581008080360276
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868055,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868055
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082683,
"mc2": 0.4071642776487792,
"mc2_stderr": 0.01422601728098354
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650877
},
"harness|gsm8k|5": {
"acc": 0.2699014404852161,
"acc_stderr": 0.012227442856468897
}
}
```
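If you only need the aggregate numbers rather than per-sample details, you can also fetch the raw results file referenced above directly (a sketch using `huggingface_hub`; the filename is taken from the run timestamp linked in this section, and the exact top-level layout of the JSON may differ slightly from the excerpt shown, hence the defensive lookup):

```python
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement",
    filename="results_2024-01-16T14-37-28.066845.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# The per-run aggregate shown above lives under an "all" block; fall back to
# the top level if the file does not nest task results under a "results" key.
aggregate = results.get("results", results).get("all")
print(aggregate)
```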
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement | [
"region:us"
] | 2024-01-16T14:34:07+00:00 | {"pretty_name": "Evaluation run of Cartinoe5930/SOLAR-DUS-implement", "dataset_summary": "Dataset automatically created during the evaluation run of model [Cartinoe5930/SOLAR-DUS-implement](https://huggingface.co/Cartinoe5930/SOLAR-DUS-implement) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T14:37:28.066845](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement/blob/main/results_2024-01-16T14-37-28.066845.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6312296500454484,\n \"acc_stderr\": 0.0323614114970197,\n \"acc_norm\": 0.6390797710653894,\n \"acc_norm_stderr\": 0.033030038319899674,\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.01522589934082683,\n \"mc2\": 0.4071642776487792,\n \"mc2_stderr\": 0.01422601728098354\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.014506769524804241,\n \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6122286397132045,\n \"acc_stderr\": 0.004862461799370392,\n \"acc_norm\": 0.811790479984067,\n \"acc_norm_stderr\": 0.003900805416736719\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368881,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368881\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097113,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097113\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739154,\n \"acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739154\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.03338473403207401,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.03338473403207401\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8135376756066411,\n \"acc_stderr\": 0.013927751372001503,\n \"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 0.013927751372001503\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.02500931379006971,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.02500931379006971\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n \"acc_stderr\": 0.015581008080360276,\n \"acc_norm\": 0.31843575418994413,\n \"acc_norm_stderr\": 0.015581008080360276\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868055,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868055\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.01522589934082683,\n \"mc2\": 0.4071642776487792,\n \"mc2_stderr\": 0.01422601728098354\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650877\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2699014404852161,\n \"acc_stderr\": 0.012227442856468897\n }\n}\n```", "repo_url": 
"https://huggingface.co/Cartinoe5930/SOLAR-DUS-implement", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|arc:challenge|25_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|arc:challenge|25_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|gsm8k|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|gsm8k|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hellaswag|10_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hellaswag|10_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-31-52.747205.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T14-31-52.747205.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-37-28.066845.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-37-28.066845.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-37-28.066845.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T14-37-28.066845.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-31-52.747205.parquet"]}, 
{"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["**/details_harness|winogrande|5_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": ["**/details_harness|winogrande|5_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T14-37-28.066845.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T14_31_52.747205", "path": ["results_2024-01-16T14-31-52.747205.parquet"]}, {"split": "2024_01_16T14_37_28.066845", "path": 
["results_2024-01-16T14-37-28.066845.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T14-37-28.066845.parquet"]}]}]} | 2024-01-16T14:39:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Cartinoe5930/SOLAR-DUS-implement
Dataset automatically created during the evaluation run of model Cartinoe5930/SOLAR-DUS-implement on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
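A minimal sketch, assuming the dataset follows the standard `open-llm-leaderboard/details_<org>__<model>` naming used by these evaluation cards (the exact repo id is not shown in this rendering) and using one of the configs listed in the metadata above:

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the per-task configs listed in the metadata;
# any other config name can be substituted in the same way.
data = load_dataset("open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement",
	"harness_winogrande_5",
	split="train")
```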
## Latest results
These are the latest results from run 2024-01-16T14:37:28.066845 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Cartinoe5930/SOLAR-DUS-implement\n\n\n\nDataset automatically created during the evaluation run of model Cartinoe5930/SOLAR-DUS-implement on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T14:37:28.066845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Cartinoe5930/SOLAR-DUS-implement\n\n\n\nDataset automatically created during the evaluation run of model Cartinoe5930/SOLAR-DUS-implement on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T14:37:28.066845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
75df64b0559265a3ca3ab73d07dfde5461d9c3bc |
# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xriminact/TarsChattyBasev0.2](https://huggingface.co/xriminact/TarsChattyBasev0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T14:41:25.883812](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2/blob/main/results_2024-01-16T14-41-25.883812.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48620348001542985,
"acc_stderr": 0.0348281020538984,
"acc_norm": 0.48570430699642225,
"acc_norm_stderr": 0.03554381019405187,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184824,
"mc2": 0.4378655189377286,
"mc2_stderr": 0.01517288496510812
},
"harness|arc:challenge|25": {
"acc": 0.4761092150170648,
"acc_stderr": 0.014594701798071654,
"acc_norm": 0.5221843003412969,
"acc_norm_stderr": 0.014597001927076135
},
"harness|hellaswag|10": {
"acc": 0.5952001593308106,
"acc_stderr": 0.004898501014225837,
"acc_norm": 0.7778331009759012,
"acc_norm_stderr": 0.004148531608981493
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5169811320754717,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.5169811320754717,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.02467786284133278,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.02467786284133278
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743743,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5709677419354838,
"acc_stderr": 0.028156036538233193,
"acc_norm": 0.5709677419354838,
"acc_norm_stderr": 0.028156036538233193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.503030303030303,
"acc_stderr": 0.039042723414318574,
"acc_norm": 0.503030303030303,
"acc_norm_stderr": 0.039042723414318574
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.02533466708095495,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.02533466708095495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6091743119266055,
"acc_stderr": 0.020920058346111055,
"acc_norm": 0.6091743119266055,
"acc_norm_stderr": 0.020920058346111055
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.034658681963807614,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.034658681963807614
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5907172995780591,
"acc_stderr": 0.032007041833595914,
"acc_norm": 0.5907172995780591,
"acc_norm_stderr": 0.032007041833595914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.043171711948702556,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.043171711948702556
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.030572811310299604,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.030572811310299604
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6053639846743295,
"acc_stderr": 0.017478464305911542,
"acc_norm": 0.6053639846743295,
"acc_norm_stderr": 0.017478464305911542
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.01581390128391305,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.01581390128391305
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5065359477124183,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.5065359477124183,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5209003215434084,
"acc_stderr": 0.028373270961069414,
"acc_norm": 0.5209003215434084,
"acc_norm_stderr": 0.028373270961069414
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347243,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347243
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35267275097783574,
"acc_stderr": 0.012203286846053887,
"acc_norm": 0.35267275097783574,
"acc_norm_stderr": 0.012203286846053887
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.047764491623961985,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.047764491623961985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184824,
"mc2": 0.4378655189377286,
"mc2_stderr": 0.01517288496510812
},
"harness|winogrande|5": {
"acc": 0.6945540647198106,
"acc_stderr": 0.012945038632552032
},
"harness|gsm8k|5": {
"acc": 0.5360121304018196,
"acc_stderr": 0.013736715929950315
}
}
```
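The aggregated numbers above can also be pulled programmatically. A sketch, assuming the same "results" config and "latest" split layout used by the other leaderboard detail datasets:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above;
# the "latest" split always points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2",
	"results",
	split="latest")
print(results[0])  # aggregated scores for the 2024-01-16T14:41:25 run
```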
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2 | [
"region:us"
] | 2024-01-16T14:43:49+00:00 | {"pretty_name": "Evaluation run of xriminact/TarsChattyBasev0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [xriminact/TarsChattyBasev0.2](https://huggingface.co/xriminact/TarsChattyBasev0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T14:41:25.883812](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2/blob/main/results_2024-01-16T14-41-25.883812.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48620348001542985,\n \"acc_stderr\": 0.0348281020538984,\n \"acc_norm\": 0.48570430699642225,\n \"acc_norm_stderr\": 0.03554381019405187,\n \"mc1\": 0.2913096695226438,\n \"mc1_stderr\": 0.015905987048184824,\n \"mc2\": 0.4378655189377286,\n \"mc2_stderr\": 0.01517288496510812\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4761092150170648,\n \"acc_stderr\": 0.014594701798071654,\n \"acc_norm\": 0.5221843003412969,\n \"acc_norm_stderr\": 0.014597001927076135\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5952001593308106,\n \"acc_stderr\": 0.004898501014225837,\n \"acc_norm\": 0.7778331009759012,\n \"acc_norm_stderr\": 0.004148531608981493\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 
0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743743,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743743\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5709677419354838,\n \"acc_stderr\": 0.028156036538233193,\n \"acc_norm\": 0.5709677419354838,\n \"acc_norm_stderr\": 0.028156036538233193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.503030303030303,\n \"acc_stderr\": 0.039042723414318574,\n \"acc_norm\": 0.503030303030303,\n \"acc_norm_stderr\": 0.039042723414318574\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.02533466708095495,\n \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.02533466708095495\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6091743119266055,\n \"acc_stderr\": 0.020920058346111055,\n \"acc_norm\": 0.6091743119266055,\n \"acc_norm_stderr\": 0.020920058346111055\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.034658681963807614,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.034658681963807614\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5907172995780591,\n \"acc_stderr\": 0.032007041833595914,\n \"acc_norm\": 0.5907172995780591,\n \"acc_norm_stderr\": 0.032007041833595914\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.043171711948702556,\n \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.043171711948702556\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.030572811310299604,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.030572811310299604\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6053639846743295,\n 
\"acc_stderr\": 0.017478464305911542,\n \"acc_norm\": 0.6053639846743295,\n \"acc_norm_stderr\": 0.017478464305911542\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n \"acc_stderr\": 0.01581390128391305,\n \"acc_norm\": 0.33743016759776534,\n \"acc_norm_stderr\": 0.01581390128391305\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5065359477124183,\n \"acc_stderr\": 0.028627470550556047,\n \"acc_norm\": 0.5065359477124183,\n \"acc_norm_stderr\": 0.028627470550556047\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5209003215434084,\n \"acc_stderr\": 0.028373270961069414,\n \"acc_norm\": 0.5209003215434084,\n \"acc_norm_stderr\": 0.028373270961069414\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35267275097783574,\n \"acc_stderr\": 0.012203286846053887,\n \"acc_norm\": 0.35267275097783574,\n \"acc_norm_stderr\": 0.012203286846053887\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4738562091503268,\n \"acc_stderr\": 0.020200164564804588,\n \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.020200164564804588\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.4636363636363636,\n \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.03660298834049163,\n \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.03660298834049163\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n \"mc1_stderr\": 0.015905987048184824,\n \"mc2\": 0.4378655189377286,\n \"mc2_stderr\": 0.01517288496510812\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6945540647198106,\n \"acc_stderr\": 0.012945038632552032\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5360121304018196,\n \"acc_stderr\": 0.013736715929950315\n }\n}\n```", "repo_url": 
"https://huggingface.co/xriminact/TarsChattyBasev0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|arc:challenge|25_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|gsm8k|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hellaswag|10_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-41-25.883812.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-41-25.883812.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-41-25.883812.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T14-41-25.883812.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-41-25.883812.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-41-25.883812.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["**/details_harness|winogrande|5_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T14-41-25.883812.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T14_41_25.883812", "path": ["results_2024-01-16T14-41-25.883812.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T14-41-25.883812.parquet"]}]}]} | 2024-01-16T14:44:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.2
Dataset automatically created during the evaluation run of model xriminact/TarsChattyBasev0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-16T14:41:25.883812(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.2\n\n\n\nDataset automatically created during the evaluation run of model xriminact/TarsChattyBasev0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T14:41:25.883812(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.2\n\n\n\nDataset automatically created during the evaluation run of model xriminact/TarsChattyBasev0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T14:41:25.883812(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9cc7a95503b66d7c6aa67887dc2562eae1a4dd3e |
# Dataset Card for Evaluation run of mosaicml/mpt-7b-8k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-8k](https://huggingface.co/mosaicml/mpt-7b-8k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b-8k",
"harness_winogrande_5",
split="train")
```
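
Any of the 63 task-specific configurations can be loaded the same way by swapping the configuration name. The sketch below is illustrative: it assumes the usual leaderboard naming pattern (`harness_<task>_<num_fewshot>`, here the 5-shot MMLU abstract-algebra subtask) and uses the `latest` split, which points to the most recent run:

```python
from datasets import load_dataset

# Per-example details for one MMLU subtask (config name assumed from the
# standard "harness_<task>_<num_fewshot>" pattern used by the leaderboard).
details = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-7b-8k",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",  # the "latest" split tracks the most recent evaluation run
)

# Inspect the available columns before relying on specific fields.
print(details.column_names)
print(details[0])
```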
## Latest results
These are the [latest results from run 2024-01-16T14:43:48.109992](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-8k/blob/main/results_2024-01-16T14-43-48.109992.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.42743057148449115,
"acc_stderr": 0.03450500692223034,
"acc_norm": 0.4325211129796156,
"acc_norm_stderr": 0.03531856610209659,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.3664779193013505,
"mc2_stderr": 0.013657498594566674
},
"harness|arc:challenge|25": {
"acc": 0.43430034129692835,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.4735494880546075,
"acc_norm_stderr": 0.014590931358120169
},
"harness|hellaswag|10": {
"acc": 0.5732921728739295,
"acc_stderr": 0.004935882666250478,
"acc_norm": 0.7740489942242581,
"acc_norm_stderr": 0.0041735236727608
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779205,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.03724249595817729,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.03724249595817729
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708614,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708614
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4806451612903226,
"acc_stderr": 0.028422687404312107,
"acc_norm": 0.4806451612903226,
"acc_norm_stderr": 0.028422687404312107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5906735751295337,
"acc_stderr": 0.03548608168860806,
"acc_norm": 0.5906735751295337,
"acc_norm_stderr": 0.03548608168860806
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36923076923076925,
"acc_stderr": 0.024468615241478912,
"acc_norm": 0.36923076923076925,
"acc_norm_stderr": 0.024468615241478912
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5339449541284403,
"acc_stderr": 0.021387863350353982,
"acc_norm": 0.5339449541284403,
"acc_norm_stderr": 0.021387863350353982
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.03128039084329881,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.03128039084329881
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.034924061041636124,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.034924061041636124
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5907172995780591,
"acc_stderr": 0.032007041833595914,
"acc_norm": 0.5907172995780591,
"acc_norm_stderr": 0.032007041833595914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4798206278026906,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.4798206278026906,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5041322314049587,
"acc_stderr": 0.04564198767432754,
"acc_norm": 0.5041322314049587,
"acc_norm_stderr": 0.04564198767432754
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4539877300613497,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.4539877300613497,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.4368932038834951,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.4368932038834951,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5811965811965812,
"acc_stderr": 0.03232128912157792,
"acc_norm": 0.5811965811965812,
"acc_norm_stderr": 0.03232128912157792
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5542784163473818,
"acc_stderr": 0.0177742972824795,
"acc_norm": 0.5542784163473818,
"acc_norm_stderr": 0.0177742972824795
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4421965317919075,
"acc_stderr": 0.0267386036438074,
"acc_norm": 0.4421965317919075,
"acc_norm_stderr": 0.0267386036438074
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850407,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850407
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.028555827516528777,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.028555827516528777
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5112540192926045,
"acc_stderr": 0.028390897396863526,
"acc_norm": 0.5112540192926045,
"acc_norm_stderr": 0.028390897396863526
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5,
"acc_stderr": 0.02782074420373286,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02782074420373286
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503807,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503807
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3285528031290743,
"acc_stderr": 0.011996027247502901,
"acc_norm": 0.3285528031290743,
"acc_norm_stderr": 0.011996027247502901
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34191176470588236,
"acc_stderr": 0.028814722422254177,
"acc_norm": 0.34191176470588236,
"acc_norm_stderr": 0.028814722422254177
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.020102583895887188,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.020102583895887188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4818181818181818,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.4818181818181818,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.43673469387755104,
"acc_stderr": 0.03175195237583322,
"acc_norm": 0.43673469387755104,
"acc_norm_stderr": 0.03175195237583322
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5174129353233831,
"acc_stderr": 0.03533389234739245,
"acc_norm": 0.5174129353233831,
"acc_norm_stderr": 0.03533389234739245
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.03789134424611548,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.03789134424611548
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.038057975055904594,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.038057975055904594
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.3664779193013505,
"mc2_stderr": 0.013657498594566674
},
"harness|winogrande|5": {
"acc": 0.7111286503551697,
"acc_stderr": 0.01273824127101844
},
"harness|gsm8k|5": {
"acc": 0.08339651250947688,
"acc_stderr": 0.00761565027710667
}
}
```
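
If you prefer to work with these aggregated numbers programmatically instead of reading the JSON above, you can load the `results` configuration mentioned earlier. This is a minimal sketch; the exact column layout depends on the harness version, so list the fields before reading specific metrics:

```python
from datasets import load_dataset

# Aggregated per-task metrics for this model (one parquet file per run);
# the "latest" split points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-7b-8k",
    "results",
    split="latest",
)

# Column names vary with the harness version, so inspect them first.
print(results.column_names)
print(results[0])
```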
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mosaicml__mpt-7b-8k | [
"region:us"
] | 2024-01-16T14:45:51+00:00 | {"pretty_name": "Evaluation run of mosaicml/mpt-7b-8k", "dataset_summary": "Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-8k](https://huggingface.co/mosaicml/mpt-7b-8k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-7b-8k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T14:43:48.109992](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-8k/blob/main/results_2024-01-16T14-43-48.109992.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.42743057148449115,\n \"acc_stderr\": 0.03450500692223034,\n \"acc_norm\": 0.4325211129796156,\n \"acc_norm_stderr\": 0.03531856610209659,\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.3664779193013505,\n \"mc2_stderr\": 0.013657498594566674\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.43430034129692835,\n \"acc_stderr\": 0.01448470304885736,\n \"acc_norm\": 0.4735494880546075,\n \"acc_norm_stderr\": 0.014590931358120169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5732921728739295,\n \"acc_stderr\": 0.004935882666250478,\n \"acc_norm\": 0.7740489942242581,\n \"acc_norm_stderr\": 0.0041735236727608\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779205,\n \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779205\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500482,\n \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500482\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n 
\"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n \"acc_stderr\": 0.03724249595817729,\n \"acc_norm\": 0.3930635838150289,\n \"acc_norm_stderr\": 0.03724249595817729\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708614,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708614\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4806451612903226,\n \"acc_stderr\": 0.028422687404312107,\n \"acc_norm\": 0.4806451612903226,\n \"acc_norm_stderr\": 0.028422687404312107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5906735751295337,\n \"acc_stderr\": 0.03548608168860806,\n \"acc_norm\": 0.5906735751295337,\n \"acc_norm_stderr\": 0.03548608168860806\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.36923076923076925,\n \"acc_stderr\": 
0.024468615241478912,\n \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.024468615241478912\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.03181110032413926,\n \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.03181110032413926\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5339449541284403,\n \"acc_stderr\": 0.021387863350353982,\n \"acc_norm\": 0.5339449541284403,\n \"acc_norm_stderr\": 0.021387863350353982\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329881,\n \"acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329881\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.034924061041636124,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.034924061041636124\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5907172995780591,\n \"acc_stderr\": 0.032007041833595914,\n \"acc_norm\": 0.5907172995780591,\n \"acc_norm_stderr\": 0.032007041833595914\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5041322314049587,\n \"acc_stderr\": 0.04564198767432754,\n \"acc_norm\": 0.5041322314049587,\n \"acc_norm_stderr\": 0.04564198767432754\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4539877300613497,\n \"acc_stderr\": 0.0391170190467718,\n \"acc_norm\": 0.4539877300613497,\n \"acc_norm_stderr\": 0.0391170190467718\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.4368932038834951,\n \"acc_stderr\": 0.04911147107365777,\n \"acc_norm\": 0.4368932038834951,\n \"acc_norm_stderr\": 0.04911147107365777\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5811965811965812,\n \"acc_stderr\": 0.03232128912157792,\n \"acc_norm\": 0.5811965811965812,\n \"acc_norm_stderr\": 0.03232128912157792\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5542784163473818,\n \"acc_stderr\": 0.0177742972824795,\n \"acc_norm\": 0.5542784163473818,\n \"acc_norm_stderr\": 0.0177742972824795\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4421965317919075,\n \"acc_stderr\": 0.0267386036438074,\n \"acc_norm\": 0.4421965317919075,\n \"acc_norm_stderr\": 0.0267386036438074\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n \"acc_stderr\": 0.014487500852850407,\n \"acc_norm\": 0.25027932960893856,\n \"acc_norm_stderr\": 0.014487500852850407\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.028555827516528777,\n \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.028555827516528777\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5112540192926045,\n \"acc_stderr\": 0.028390897396863526,\n \"acc_norm\": 0.5112540192926045,\n \"acc_norm_stderr\": 0.028390897396863526\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.02782074420373286,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.02782074420373286\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503807,\n \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503807\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3285528031290743,\n \"acc_stderr\": 0.011996027247502901,\n \"acc_norm\": 0.3285528031290743,\n \"acc_norm_stderr\": 0.011996027247502901\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.34191176470588236,\n \"acc_stderr\": 0.028814722422254177,\n \"acc_norm\": 0.34191176470588236,\n \"acc_norm_stderr\": 0.028814722422254177\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.020102583895887188,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.020102583895887188\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.4818181818181818,\n \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.43673469387755104,\n \"acc_stderr\": 0.03175195237583322,\n \"acc_norm\": 0.43673469387755104,\n \"acc_norm_stderr\": 0.03175195237583322\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5174129353233831,\n \"acc_stderr\": 0.03533389234739245,\n \"acc_norm\": 0.5174129353233831,\n \"acc_norm_stderr\": 0.03533389234739245\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.03789134424611548,\n \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.03789134424611548\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5614035087719298,\n \"acc_stderr\": 0.038057975055904594,\n \"acc_norm\": 0.5614035087719298,\n \"acc_norm_stderr\": 0.038057975055904594\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.3664779193013505,\n \"mc2_stderr\": 0.013657498594566674\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7111286503551697,\n \"acc_stderr\": 0.01273824127101844\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08339651250947688,\n \"acc_stderr\": 0.00761565027710667\n }\n}\n```", "repo_url": "https://huggingface.co/mosaicml/mpt-7b-8k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|arc:challenge|25_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|gsm8k|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hellaswag|10_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-43-48.109992.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-43-48.109992.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-43-48.109992.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T14-43-48.109992.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-43-48.109992.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["**/details_harness|winogrande|5_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T14-43-48.109992.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T14_43_48.109992", "path": ["results_2024-01-16T14-43-48.109992.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T14-43-48.109992.parquet"]}]}]} | 2024-01-16T14:46:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mosaicml/mpt-7b-8k
Dataset automatically created during the evaluation run of model mosaicml/mpt-7b-8k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
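For instance, pulling the Winogrande details split (this is the same call given in this card's metadata):

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande eval of mosaicml/mpt-7b-8k
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b-8k",
	"harness_winogrande_5",
	split="train")
```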
## Latest results
These are the latest results from run 2024-01-16T14:43:48.109992 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mosaicml/mpt-7b-8k\n\n\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-8k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T14:43:48.109992(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mosaicml/mpt-7b-8k\n\n\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-8k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T14:43:48.109992(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
04a80a2cbfce648ce947db644a351544a5ce6874 |
# Dataset Card for Evaluation run of AIGeekLabs/radiantloom-mixtral-8x7b-fusion
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIGeekLabs/radiantloom-mixtral-8x7b-fusion](https://huggingface.co/AIGeekLabs/radiantloom-mixtral-8x7b-fusion) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
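Because there is one configuration per task plus the aggregated "results" configuration, it can help to enumerate what is available before loading anything. A minimal sketch using the `datasets` library's listing helpers (output is illustrative, not taken from this card):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_AIGeekLabs__radiantloom-mixtral-8x7b-fusion"

# One config per evaluated task, plus the aggregated "results" config
configs = get_dataset_config_names(repo)
print(len(configs))

# Each config has one timestamped split per run, plus "latest"
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```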
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIGeekLabs__radiantloom-mixtral-8x7b-fusion",
"harness_winogrande_5",
split="train")
```
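The aggregated scores reported under "Latest results" below live in the separate "results" configuration; a minimal sketch for pulling its newest snapshot, assuming the same "latest"-split convention as the task configurations:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model
results = load_dataset("open-llm-leaderboard/details_AIGeekLabs__radiantloom-mixtral-8x7b-fusion",
	"results",
	split="latest")
print(results[0])
```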
## Latest results
These are the [latest results from run 2024-01-16T18:14:22.936356](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGeekLabs__radiantloom-mixtral-8x7b-fusion/blob/main/results_2024-01-16T18-14-22.936356.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6025083495631576,
"acc_stderr": 0.033276409512043775,
"acc_norm": 0.6048474981986636,
"acc_norm_stderr": 0.03394556451708956,
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591663,
"mc2": 0.5475857676636995,
"mc2_stderr": 0.0158714845716314
},
"harness|arc:challenge|25": {
"acc": 0.5972696245733788,
"acc_stderr": 0.01433223630679015,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268804
},
"harness|hellaswag|10": {
"acc": 0.6456881099382593,
"acc_stderr": 0.004773267510112743,
"acc_norm": 0.8364867556263692,
"acc_norm_stderr": 0.003690774563638011
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05000000000000001,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05000000000000001
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531006,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531006
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.02811209121011746,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.02811209121011746
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150027,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150027
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443135,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443135
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069425,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069425
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7931034482758621,
"acc_stderr": 0.014485656041669173,
"acc_norm": 0.7931034482758621,
"acc_norm_stderr": 0.014485656041669173
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.016525425898773503,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.016525425898773503
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.027245613047215355,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.027245613047215355
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824087,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824087
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935729,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935729
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573702,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573702
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591663,
"mc2": 0.5475857676636995,
"mc2_stderr": 0.0158714845716314
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.01198854184484391
},
"harness|gsm8k|5": {
"acc": 0.5344958301743745,
"acc_stderr": 0.013739668147545916
}
}
```
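The per-example predictions behind these aggregated scores are stored in the task-specific configurations of this repository. As a minimal sketch (assuming the standard `datasets` loading pattern used by Open LLM Leaderboard detail repositories; the config and split names are taken from this repository's metadata), one way to inspect them:

```python
from datasets import load_dataset

# Minimal sketch: pull the per-example details behind one of the aggregated
# scores above, e.g. the 5-shot GSM8K run. The config name "harness_gsm8k_5"
# and the "latest" split are taken from this repository's metadata.
details = load_dataset(
    "open-llm-leaderboard/details_AIGeekLabs__radiantloom-mixtral-8x7b-fusion",
    "harness_gsm8k_5",
    split="latest",
)
print(details[0])
```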
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Each of the 63 configurations corresponds to one evaluated task. Within a configuration, every evaluation run is stored as a split named after the run timestamp (e.g. `2024_01_16T18_14_22.936356`), and a `latest` split always points to the most recent run. An additional `results` configuration stores the aggregated results of the runs.
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AIGeekLabs__radiantloom-mixtral-8x7b-fusion | [
"region:us"
] | 2024-01-16T14:48:05+00:00 | {"pretty_name": "Evaluation run of AIGeekLabs/radiantloom-mixtral-8x7b-fusion", "dataset_summary": "Dataset automatically created during the evaluation run of model [AIGeekLabs/radiantloom-mixtral-8x7b-fusion](https://huggingface.co/AIGeekLabs/radiantloom-mixtral-8x7b-fusion) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIGeekLabs__radiantloom-mixtral-8x7b-fusion\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T18:14:22.936356](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGeekLabs__radiantloom-mixtral-8x7b-fusion/blob/main/results_2024-01-16T18-14-22.936356.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6025083495631576,\n \"acc_stderr\": 0.033276409512043775,\n \"acc_norm\": 0.6048474981986636,\n \"acc_norm_stderr\": 0.03394556451708956,\n \"mc1\": 0.38555691554467564,\n \"mc1_stderr\": 0.017038839010591663,\n \"mc2\": 0.5475857676636995,\n \"mc2_stderr\": 0.0158714845716314\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.01433223630679015,\n \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268804\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6456881099382593,\n \"acc_stderr\": 0.004773267510112743,\n \"acc_norm\": 0.8364867556263692,\n \"acc_norm_stderr\": 0.003690774563638011\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05000000000000001,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05000000000000001\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370331,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370331\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531006,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531006\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.6290322580645161,\n \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.02811209121011746,\n \"acc_norm\": 0.8134715025906736,\n 
\"acc_norm_stderr\": 0.02811209121011746\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150027,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150027\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443135,\n \"acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443135\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069425,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069425\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7931034482758621,\n \"acc_stderr\": 0.014485656041669173,\n \"acc_norm\": 0.7931034482758621,\n \"acc_norm_stderr\": 0.014485656041669173\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n \"acc_stderr\": 0.016525425898773503,\n \"acc_norm\": 0.423463687150838,\n \"acc_norm_stderr\": 0.016525425898773503\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.027245613047215355,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.027245613047215355\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824087,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824087\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n \"acc_stderr\": 0.012669813464935729,\n \"acc_norm\": 0.43741851368970014,\n \"acc_norm_stderr\": 0.012669813464935729\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573702,\n \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573702\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.582089552238806,\n \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n \"mc1_stderr\": 0.017038839010591663,\n \"mc2\": 0.5475857676636995,\n \"mc2_stderr\": 0.0158714845716314\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.01198854184484391\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.5344958301743745,\n \"acc_stderr\": 0.013739668147545916\n }\n}\n```", "repo_url": "https://huggingface.co/AIGeekLabs/radiantloom-mixtral-8x7b-fusion", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|arc:challenge|25_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|arc:challenge|25_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|gsm8k|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|gsm8k|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hellaswag|10_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hellaswag|10_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-45-36.248240.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T14-45-36.248240.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-14-22.936356.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-14-22.936356.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-14-22.936356.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T18-14-22.936356.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-14-22.936356.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": 
"2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-45-36.248240.parquet"]}, 
{"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["**/details_harness|winogrande|5_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": ["**/details_harness|winogrande|5_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T18-14-22.936356.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T14_45_36.248240", "path": ["results_2024-01-16T14-45-36.248240.parquet"]}, {"split": "2024_01_16T18_14_22.936356", "path": 
["results_2024-01-16T18-14-22.936356.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T18-14-22.936356.parquet"]}]}]} | 2024-01-16T18:16:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AIGeekLabs/radiantloom-mixtral-8x7b-fusion
Dataset automatically created during the evaluation run of model AIGeekLabs/radiantloom-mixtral-8x7b-fusion on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-16T18:14:22.936356 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AIGeekLabs/radiantloom-mixtral-8x7b-fusion\n\n\n\nDataset automatically created during the evaluation run of model AIGeekLabs/radiantloom-mixtral-8x7b-fusion on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T18:14:22.936356(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AIGeekLabs/radiantloom-mixtral-8x7b-fusion\n\n\n\nDataset automatically created during the evaluation run of model AIGeekLabs/radiantloom-mixtral-8x7b-fusion on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T18:14:22.936356(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0dad750b7b0b4da4e66f173e38bff20c99ebb7e4 |
# Dataset Card for Evaluation run of zhengr/NeuralPipe-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [zhengr/NeuralPipe-7B-slerp](https://huggingface.co/zhengr/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zhengr__NeuralPipe-7B-slerp",
"harness_winogrande_5",
split="train")
```
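As a further illustration (a minimal sketch, not part of the generated card), the aggregated metrics mentioned above can be pulled through the "results" configuration, whose "latest" split always points at the most recent run:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated scores of each run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_zhengr__NeuralPipe-7B-slerp",
                       "results",
                       split="latest")
print(results)
```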
## Latest results
These are the [latest results from run 2024-01-16T14:47:04.829049](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengr__NeuralPipe-7B-slerp/blob/main/results_2024-01-16T14-47-04.829049.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6447169263171724,
"acc_stderr": 0.03211493893533018,
"acc_norm": 0.6450175117328331,
"acc_norm_stderr": 0.03277128130072703,
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332366,
"mc2": 0.5982418830210784,
"mc2_stderr": 0.01515275893598861
},
"harness|arc:challenge|25": {
"acc": 0.6467576791808873,
"acc_stderr": 0.013967822714840055,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693252
},
"harness|hellaswag|10": {
"acc": 0.6692889862577176,
"acc_stderr": 0.004695076629884537,
"acc_norm": 0.8611830312686716,
"acc_norm_stderr": 0.003450488042965012
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.02386800326250011,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.02386800326250011
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.358659217877095,
"acc_stderr": 0.016040454426164474,
"acc_norm": 0.358659217877095,
"acc_norm_stderr": 0.016040454426164474
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922438,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922438
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332366,
"mc2": 0.5982418830210784,
"mc2_stderr": 0.01515275893598861
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.011285013754047436
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.01270568572313171
}
}
```
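As a small, hedged illustration of how these numbers might be consumed (the file name comes from the link above; the top-level key layout of the JSON is an assumption and may need adjusting):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_zhengr__NeuralPipe-7B-slerp",
    filename="results_2024-01-16T14-47-04.829049.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assume the per-task scores sit under a "results" key; fall back to a flat layout.
scores = data.get("results", data)

# Unweighted average accuracy over the hendrycksTest (MMLU) subtasks shown above.
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU macro-average over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```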
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_zhengr__NeuralPipe-7B-slerp | [
"region:us"
] | 2024-01-16T14:49:23+00:00 | {"pretty_name": "Evaluation run of zhengr/NeuralPipe-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [zhengr/NeuralPipe-7B-slerp](https://huggingface.co/zhengr/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zhengr__NeuralPipe-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T14:47:04.829049](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengr__NeuralPipe-7B-slerp/blob/main/results_2024-01-16T14-47-04.829049.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6447169263171724,\n \"acc_stderr\": 0.03211493893533018,\n \"acc_norm\": 0.6450175117328331,\n \"acc_norm_stderr\": 0.03277128130072703,\n \"mc1\": 0.43084455324357407,\n \"mc1_stderr\": 0.017335272475332366,\n \"mc2\": 0.5982418830210784,\n \"mc2_stderr\": 0.01515275893598861\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840055,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693252\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6692889862577176,\n \"acc_stderr\": 0.004695076629884537,\n \"acc_norm\": 0.8611830312686716,\n \"acc_norm_stderr\": 0.003450488042965012\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 
0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n 
\"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n \"acc_norm_stderr\": 
0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.02386800326250011,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.02386800326250011\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.358659217877095,\n \"acc_stderr\": 0.016040454426164474,\n \"acc_norm\": 0.358659217877095,\n \"acc_norm_stderr\": 0.016040454426164474\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n \"acc_stderr\": 0.012750151802922438,\n \"acc_norm\": 0.47196870925684486,\n \"acc_norm_stderr\": 0.012750151802922438\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43084455324357407,\n \"mc1_stderr\": 0.017335272475332366,\n \"mc2\": 0.5982418830210784,\n \"mc2_stderr\": 0.01515275893598861\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047436\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \"acc_stderr\": 0.01270568572313171\n }\n}\n```", "repo_url": "https://huggingface.co/zhengr/NeuralPipe-7B-slerp", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|arc:challenge|25_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|gsm8k|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hellaswag|10_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-47-04.829049.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-47-04.829049.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-47-04.829049.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T14-47-04.829049.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-47-04.829049.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-47-04.829049.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["**/details_harness|winogrande|5_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T14-47-04.829049.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T14_47_04.829049", "path": ["results_2024-01-16T14-47-04.829049.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T14-47-04.829049.parquet"]}]}]} | 2024-01-16T14:49:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of zhengr/NeuralPipe-7B-slerp
Dataset automatically created during the evaluation run of model zhengr/NeuralPipe-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
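A minimal sketch of that call is shown below. The repository id is assumed to follow the leaderboard's usual `details_<org>__<model>` naming (it is not spelled out on this card), while the config and split names are taken from the file list for this dataset:

```python
from datasets import load_dataset

# Load one evaluation task (config) from this details repository.
# The repo id below is an assumption based on the leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_zhengr__NeuralPipe-7B-slerp",
    "harness_winogrande_5",  # any config defined for this dataset works
    split="latest",          # or the timestamped split, e.g. "2024_01_16T14_47_04.829049"
)
print(data)
```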
## Latest results
These are the latest results from run 2024-01-16T14:47:04.829049 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of zhengr/NeuralPipe-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model zhengr/NeuralPipe-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T14:47:04.829049(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of zhengr/NeuralPipe-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model zhengr/NeuralPipe-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T14:47:04.829049(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0c09e8555a382ea43497089b2a03988b52af3ea9 | # Dataset Card for "0-10000-ultrafeedback-binarized-preferences-cleaned-ita"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | giux78/0-10000-ultrafeedback-binarized-preferences-cleaned-ita | [
"region:us"
] | 2024-01-16T15:42:34+00:00 | {"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen-rating", "dtype": "float64"}, {"name": "chosen-model", "dtype": "string"}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected-rating", "dtype": "float64"}, {"name": "rejected-model", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 74602858, "num_examples": 10000}], "download_size": 35293053, "dataset_size": 74602858}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T15:42:39+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "0-10000-ultrafeedback-binarized-preferences-cleaned-ita"
More Information needed | [
"# Dataset Card for \"0-10000-ultrafeedback-binarized-preferences-cleaned-ita\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"0-10000-ultrafeedback-binarized-preferences-cleaned-ita\"\n\nMore Information needed"
] |
238b76a65f62516fa763295919cf68a8b727d66f |
<img src="https://huggingface.co/datasets/hkust-nlp/deita-images/resolve/main/logo-final.png" alt="Deita banner" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# Dataset Card for Deita Complexity Scorer Training Data
[GitHub](https://github.com/hkust-nlp/deita) | [Paper](https://arxiv.org/abs/2312.15685)
Deita is an open-sourced project designed to facilitate **Automatic Data Selection** for instruction tuning in Large Language Models (LLMs).
This dataset includes data for training Deita Complexity Scorer.
**Model Family**: Other models and the dataset are found in the [Deita Collection](https://huggingface.co/collections/hkust-nlp/deita-6569c198c174808d94cf5bd4)
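As a quick sketch (the split and column names are not documented on this card, so inspect them after loading rather than assuming them):

```python
from datasets import load_dataset

# Training data for the Deita complexity scorer.
ds = load_dataset("hkust-nlp/deita-complexity-scorer-data")
print(ds)                      # shows which splits are available

first_split = next(iter(ds))   # don't assume a "train" split exists
print(ds[first_split][0])      # inspect the record schema before training
```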
## Performance
| Model | Align | Data Size | MT-Bench | AlpacaEval(%) | OpenLLM (Avg.) |
|------------------------------------------------|-----------|------------|----------|---------------|----------------|
| **Proprietary Models** | | | | | |
| GPT-4-Turbo | ? | -- | 9.32 | 97.70 | -- |
| GPT-4 | SFT + PPO | -- | 8.99 | 95.03 | -- |
| Claude-2 | SFT + PPO | -- | 8.06 | 91.36 | -- |
| GPT-3.5-turbo | SFT + PPO | -- | 7.94 | 89.37 | -- |
| **Open-sourced Models based on LLaMA-1-13B** | | | | | |
| LIMA | SFT | 1K SFT | 4.29 | 41.98 | 59.82 |
| WizardLM-13B | SFT | 70K SFT | 6.35 | 75.31 | 58.96 |
| Vicuna-13B-v1.3 | SFT | 125K SFT | 6.39 | 82.11 | 60.01 |
| Random | SFT | 10K SFT | 6.03 | 71.52 | 60.14 |
| DEITA-LLaMA1-13B-v1.0-sft | SFT | 10K SFT | 6.60 | 78.01 | 64.27 |
| **Open-sourced Models based on LLaMA-2-13B** | | | | | |
| Tulu-2-13B | SFT | 326K SFT | 6.70 | 78.90 | -- |
| Tulu-2-13B+DPO | SFT + DPO | 326K SFT + 60K DPO | 7.00 | 89.50 | -- |
| LLaMA2-13B-Chat | SFT + PPO | -- | 6.65 | 81.09 | -- |
| WizardLM-13B-v1.2 | SFT | >70K SFT | 7.09 | 89.17 | -- |
| Vicuna-13B-v1.5 | SFT | 125K SFT | 6.57 | 78.80 | 61.63 |
| Random | SFT | 10K SFT | 5.78 | 65.19 | 61.32 |
| DEITA-LLaMA2-13B-v1.0-sft | SFT | 10K SFT | 6.79 | 81.09 | 62.71 |
| **Open-sourced Models based on Mistral-7B** | | | | | |
| Mistral-7B-Instruct-v0.1 | -- | -- | 6.84 | 69.65 | 60.45 |
| Zephyr-7B-sft | SFT | 200K SFT | 5.32 | 75.12 | 60.93 |
| $\text{Zephyr-7B-}\beta$ | SFT + DPO | 200K SFT + 60K DPO | 7.34 | 90.60 | 66.36 |
| OpenChat-3.5 | C-RLFT | >> 70K C-RLFT | 7.81 | 88.51 | -- |
| Starling-7B | C-RLFT + APA | >>70K C-RLFT + 183K APA | 8.09 | 91.99 | -- |
| Random | SFT | 10K SFT | 5.89 | 56.90 | 61.72 |
| DEITA-7B-v1.0-sft (6K) | SFT | 6K SFT | 7.22 | 80.78 | 64.94 |
| DEITA-7B-v1.0-sft (10K) | SFT | 10K SFT | 7.32 | 81.67 | 64.00 |
| DEITA-7B-v1.0 | SFT + DPO | 6K SFT + 10K DPO | 7.55 | 90.06 | 69.86 |
## Citation
If you find the content of this project helpful, please cite our paper as follows:
```
@misc{liu2023what,
title={What Makes Good Data for Alignment? A Comprehensive Study of Automatic Data Selection in Instruction Tuning},
author={Wei Liu and Weihao Zeng and Keqing He and Yong Jiang and Junxian He},
year={2023},
eprint={2312.15685},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | hkust-nlp/deita-complexity-scorer-data | [
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"arxiv:2312.15685",
"region:us"
] | 2024-01-16T15:42:42+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"]} | 2024-01-16T15:49:10+00:00 | [
"2312.15685"
] | [
"en"
] | TAGS
#size_categories-1K<n<10K #language-English #license-mit #arxiv-2312.15685 #region-us
| <img src="URL alt="Deita banner" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
Dataset Card for Deita Complexity Scorer Training Data
======================================================
GitHub | Paper
Deita is an open-sourced project designed to facilitate Automatic Data Selection for instruction tuning in Large Language Models (LLMs).
This dataset includes data for training Deita Complexity Scorer.
Model Family: Other models and the dataset are found in the Deita Collection
Performance
-----------
If you find the content of this project helpful, please cite our paper as follows:
| [] | [
"TAGS\n#size_categories-1K<n<10K #language-English #license-mit #arxiv-2312.15685 #region-us \n"
] |
e0201777f053a565806cd6b2c2ecf9e357acb319 | ## LitScan EPMC Subset
This dataset is a subset of [afg1/epmc-oa-subset](https://huggingface.co/datasets/afg1/epmc-oa-subset),
which itself comes from the [Europe PMC open access subset](https://europepmc.org/downloads/openaccess) of about 5.9 million articles.
Here, we take the ~960 parquet files from the full OA subset and join them against a list of PMCIDs for articles found by [LitScan](https://rnacentral.org/help/litscan),
which should discuss ncRNA, based on the ~9.6 million IDs searched from RNAcentral. The result is a collection of just over 1 million open access
full-text articles ostensibly about ncRNA.
The primary use case for this is pre-finetuning on domain specific text. This idea of domain adaptation is similar to what
NVIDIA have done with their [ChipNeMo model](https://research.nvidia.com/publication/2023-10_chipnemo-domain-adapted-llms-chip-design).
We are planning to finetune some models on this dataset, probably TinyLlama, since it is quite quick to train.
These will be useful for e.g. generating embeddings for RAG, or further downstream finetuning on tasks like summarisation.
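For that kind of domain-adaptive pretraining, a minimal loading sketch might look like the following; the split name and column layout are assumptions to verify after loading, and streaming keeps memory use manageable for roughly a million full-text articles:

```python
from datasets import load_dataset

# Stream the subset rather than downloading ~1M articles up front.
ds = load_dataset("afg1/litscan-epmc-subset", split="train", streaming=True)

for article in ds.take(3):
    # Check the available fields (e.g. PMCID, full text) before wiring this
    # into a pre-finetuning or embedding pipeline.
    print(sorted(article.keys()))
```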
## Limitations
The epmc-oa-subset parquet files are parsed from JATS, which does not always go entirely to plan. As a result, there are likely to be some
articles with missing text, or strange tags left in. These should be quite rare, but I can't guarantee they're not in there.
LitScan itself also has some limitations, namely that there is quite a high false positive rate for those RNA IDs that are a bit generic. This
means that while most of the articles in this dataset should be focused on RNA, there will be a significant minority that are about all sorts of
other things, including but not limited to: concrete, female mice, recurrent neural networks. This is a very tricky problem to solve!
| afg1/litscan-epmc-subset | [
"task_categories:text-generation",
"size_categories:1M<n<10M",
"language:en",
"region:us"
] | 2024-01-16T15:49:59+00:00 | {"language": ["en"], "size_categories": ["1M<n<10M"], "task_categories": ["text-generation"]} | 2024-01-16T16:33:56+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-1M<n<10M #language-English #region-us
| ## LitScan EPMC Subset
This dataset is a subset of afg1/epmc-oa-subset,
which itself comes from the Europe PMC open access subset of about 5.9 million articles.
Here, we take the ~960 parquet files from the full OA subset and join them against a list of PMCIDs for articles found by LitScan,
which should discuss ncRNA for the ~9.6 million IDs searched from RNAcentral. The result is a collection of just over 1 million open access
fulltext articles ostensibly about ncRNA.
The primary use case for this is pre-finetuning on domain specific text. This idea of domain adaptation is similar to what
NVIDIA have done with their ChipNeMo model.
We are planning to finetune some models on this dataset, probably TinyLlama, since it is quite quick to train.
These will be useful for e.g. generating embeddings for RAG, or further downstream finetuning on tasks like summarisation.
## Limitations
The epmc-oa-subset parquet files are parsed from JATS, which does not always go entirely to plan. As a result, there are likely to be some
articles with missing text, or strange tags left in. These should be quite rare, but I can't guarantee they're not in there.
LitScan itself also has some limitations, namely that there is quite a high false positive rate for those RNA IDs that are a bit generic. This
means that while most of the articles in this dataset should be focused on RNA, there will be a significant minority that are about all sorts of
other things, including but not limited to: concrete, female mice, recurrent neural networks. This is a very tricky problem to solve!
| [
"## LitScan EPMC Subset\n\nThis dataset is a subset of afg1/epmc-oa-subset, \nwhich itself comes from the Europe PMC open access subset of about 5.9 million articles.\n\nHere, we take the ~960 parquet files from the full OA subset and join them against a list of PMCIDs for articles found by LitScan,\nwhich should discuss ncRNA for the ~9.6 million IDs searched from RNAcentral. The result is a collection of just over 1 million open access\nfulltext articles ostensibly about ncRNA.\n\nThe primary use case for this is pre-finetuning on domain specific text. This idea of domain adaptation is similar to what \nNVIDIA have done with their ChipNeMo model.\n\nWe are planning to finetune some models on this dataset, probably TinyLlama, since it is quite quick to train. \nThese will be useful for e.g. generating embeddings for RAG, or further downstream finetuning on tasks like summarisation.",
"## Limitations\n\nThe epmc-oa-subset parquet files are parsed from JATS, which does not always go entirely to plan. As a result, there are likely to be some \narticles with missing text, or strange tags left in. These should be quite rare, but I can't guarantee they're not in there.\n\nLitScan itself also has some limitations, namely that there is quite a high false positive rate for those RNA IDs that are a bit generic. This\nmeans that while most of the articles in this dataset should be focused on RNA, there will be a significant minority that are about all sorts of\nother things, including but not limited to: concrete, female mice, recurrent neural networks. This is a very tricky problem to solve!"
] | [
"TAGS\n#task_categories-text-generation #size_categories-1M<n<10M #language-English #region-us \n",
"## LitScan EPMC Subset\n\nThis dataset is a subset of afg1/epmc-oa-subset, \nwhich itself comes from the Europe PMC open access subset of about 5.9 million articles.\n\nHere, we take the ~960 parquet files from the full OA subset and join them against a list of PMCIDs for articles found by LitScan,\nwhich should discuss ncRNA for the ~9.6 million IDs searched from RNAcentral. The result is a collection of just over 1 million open access\nfulltext articles ostensibly about ncRNA.\n\nThe primary use case for this is pre-finetuning on domain specific text. This idea of domain adaptation is similar to what \nNVIDIA have done with their ChipNeMo model.\n\nWe are planning to finetune some models on this dataset, probably TinyLlama, since it is quite quick to train. \nThese will be useful for e.g. generating embeddings for RAG, or further downstream finetuning on tasks like summarisation.",
"## Limitations\n\nThe epmc-oa-subset parquet files are parsed from JATS, which does not always go entirely to plan. As a result, there are likely to be some \narticles with missing text, or strange tags left in. These should be quite rare, but I can't guarantee they're not in there.\n\nLitScan itself also has some limitations, namely that there is quite a high false positive rate for those RNA IDs that are a bit generic. This\nmeans that while most of the articles in this dataset should be focused on RNA, there will be a significant minority that are about all sorts of\nother things, including but not limited to: concrete, female mice, recurrent neural networks. This is a very tricky problem to solve!"
] |
3e570ef384614072d48bc0b1d9643b1b050951ed |
<img src="https://huggingface.co/datasets/hkust-nlp/deita-images/resolve/main/logo-final.png" alt="Deita banner" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# Dataset Card for Deita Quality Scorer Training Data
[GitHub](https://github.com/hkust-nlp/deita) | [Paper](https://arxiv.org/abs/2312.15685)
Deita is an open-sourced project designed to facilitate **Automatic Data Selection** for instruction tuning in Large Language Models (LLMs).
This dataset includes data for training Deita Quality Scorer.
**Model Family**: Other models and the dataset are found in the [Deita Collection](https://huggingface.co/collections/hkust-nlp/deita-6569c198c174808d94cf5bd4)
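A minimal loading sketch (split and column names should be checked after loading rather than assumed):

```python
from datasets import load_dataset

# Training data for the Deita quality scorer.
ds = load_dataset("hkust-nlp/deita-quality-scorer-data")
print(ds)                    # lists the available splits
split = next(iter(ds))
print(ds[split][0])          # shows the record schema used to train the scorer
```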
## Performance
| Model | Align | Data Size | MT-Bench | AlpacaEval(%) | OpenLLM (Avg.) |
|------------------------------------------------|-----------|------------|----------|---------------|----------------|
| **Proprietary Models** | | | | | |
| GPT-4-Turbo | ? | -- | 9.32 | 97.70 | -- |
| GPT-4 | SFT + PPO | -- | 8.99 | 95.03 | -- |
| Claude-2 | SFT + PPO | -- | 8.06 | 91.36 | -- |
| GPT-3.5-turbo | SFT + PPO | -- | 7.94 | 89.37 | -- |
| **Open-sourced Models based on LLaMA-1-13B** | | | | | |
| LIMA | SFT | 1K SFT | 4.29 | 41.98 | 59.82 |
| WizardLM-13B | SFT | 70K SFT | 6.35 | 75.31 | 58.96 |
| Vicuna-13B-v1.3 | SFT | 125K SFT | 6.39 | 82.11 | 60.01 |
| Random | SFT | 10K SFT | 6.03 | 71.52 | 60.14 |
| DEITA-LLaMA1-13B-v1.0-sft | SFT | 10K SFT | 6.60 | 78.01 | 64.27 |
| **Open-sourced Models based on LLaMA-2-13B** | | | | | |
| Tulu-2-13B | SFT | 326K SFT | 6.70 | 78.90 | -- |
| Tulu-2-13B+DPO | SFT + DPO | 326K SFT + 60K DPO | 7.00 | 89.50 | -- |
| LLaMA2-13B-Chat | SFT + PPO | -- | 6.65 | 81.09 | -- |
| WizardLM-13B-v1.2 | SFT | >70K SFT | 7.09 | 89.17 | -- |
| Vicuna-13B-v1.5 | SFT | 125K SFT | 6.57 | 78.80 | 61.63 |
| Random | SFT | 10K SFT | 5.78 | 65.19 | 61.32 |
| DEITA-LLaMA2-13B-v1.0-sft | SFT | 10K SFT | 6.79 | 81.09 | 62.71 |
| **Open-sourced Models based on Mistral-7B** | | | | | |
| Mistral-7B-Instruct-v0.1 | -- | -- | 6.84 | 69.65 | 60.45 |
| Zephyr-7B-sft | SFT | 200K SFT | 5.32 | 75.12 | 60.93 |
| $\text{Zephyr-7B-}\beta$ | SFT + DPO | 200K SFT + 60K DPO | 7.34 | 90.60 | 66.36 |
| OpenChat-3.5 | C-RLFT | >> 70K C-RLFT | 7.81 | 88.51 | -- |
| Starling-7B | C-RLFT + APA | >>70K C-RLFT + 183K APA | 8.09 | 91.99 | -- |
| Random | SFT | 10K SFT | 5.89 | 56.90 | 61.72 |
| DEITA-7B-v1.0-sft (6K) | SFT | 6K SFT | 7.22 | 80.78 | 64.94 |
| DEITA-7B-v1.0-sft (10K) | SFT | 10K SFT | 7.32 | 81.67 | 64.00 |
| DEITA-7B-v1.0 | SFT + DPO | 6K SFT + 10K DPO | 7.55 | 90.06 | 69.86 |
## Citation
If you find the content of this project helpful, please cite our paper as follows:
```
@misc{liu2023what,
title={What Makes Good Data for Alignment? A Comprehensive Study of Automatic Data Selection in Instruction Tuning},
author={Wei Liu and Weihao Zeng and Keqing He and Yong Jiang and Junxian He},
year={2023},
eprint={2312.15685},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | hkust-nlp/deita-quality-scorer-data | [
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"arxiv:2312.15685",
"region:us"
] | 2024-01-16T15:52:14+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"]} | 2024-01-16T15:55:17+00:00 | [
"2312.15685"
] | [
"en"
] | TAGS
#size_categories-1K<n<10K #language-English #license-mit #arxiv-2312.15685 #region-us
| <img src="URL alt="Deita banner" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
Dataset Card for Deita Quality Scorer Training Data
===================================================
GitHub | Paper
Deita is an open-sourced project designed to facilitate Automatic Data Selection for instruction tuning in Large Language Models (LLMs).
This dataset includes data for training Deita Quality Scorer.
Model Family: Other models and the dataset are found in the Deita Collection
Performance
-----------
If you find the content of this project helpful, please cite our paper as follows:
| [] | [
"TAGS\n#size_categories-1K<n<10K #language-English #license-mit #arxiv-2312.15685 #region-us \n"
] |
fc6a4db3493b302b2546dbd7afbe57bef36d238c | # Dataset Card for "CoT_reformatted"
This dataset is reformatted from: QingyiSi/Alpaca-CoT
All credit goes there. Thanks to QingyiSi for the work in consolidating many diverse sources for comparison and cross-file analysis.
There were some issues loading files from that dataset for a testing project.
I extracted the following data files for this subset:
- alpaca_data_cleaned
- CoT_data
- firefly
- instruct
- alpaca_gpt4_data
- dolly
- GPTeacher
- thoughtsource
- finance_en
- instinwild_en
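With those sources consolidated, a short loading sketch follows; the `instruction`/`input`/`output` columns match this repo's schema, but the prompt template itself is only an illustration:

```python
from datasets import load_dataset

# ~3.2M consolidated instruction/CoT examples in a single train split.
ds = load_dataset("jtatman/CoT_reformatted", split="train")

def to_prompt(example):
    # Illustrative template; swap in whatever format your trainer expects.
    text = example["instruction"]
    if example["input"]:
        text += "\n" + example["input"]
    return {"text": text + "\n" + example["output"]}

ds = ds.map(to_prompt)        # mapping 3.2M rows takes a while; sample first if unsure
print(ds[0]["text"][:200])
```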
| jtatman/CoT_reformatted | [
"task_categories:text-generation",
"task_categories:question-answering",
"size_categories:1M<n<10M",
"language:en",
"language:zh",
"license:apache-2.0",
"cot",
"conversational",
"region:us"
] | 2024-01-16T16:36:25+00:00 | {"language": ["en", "zh"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "question-answering"], "pretty_name": "cot reformatted", "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "id", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 2487281447, "num_examples": 3229975}], "download_size": 1513934252, "dataset_size": 2487281447}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["cot", "conversational"]} | 2024-01-16T17:43:49+00:00 | [] | [
"en",
"zh"
] | TAGS
#task_categories-text-generation #task_categories-question-answering #size_categories-1M<n<10M #language-English #language-Chinese #license-apache-2.0 #cot #conversational #region-us
| # Dataset Card for "CoT_reformatted"
This dataset is reformatted from: QingyiSi/Alpaca-CoT
All credit goes there. Thanks to QingyiSi for the work in consolidating many diverse sources for comparison and cross-file analysis.
There were some issues loading files from that dataset for a testing project.
I extracted the following data files for this subset:
- alpaca_data_cleaned
- CoT_data
- firefly
- instruct
- alpaca_gpt4_data
- dolly
- GPTeacher
- thoughtsource
- finance_en
- instinwild_en
| [
"# Dataset Card for \"CoT_reformatted\"\n\nThis dataset is reformatted from: QingyiSi/Alpaca-CoT\n\nAll credit goes there. Thanks to QingyiSi for the work in consolidating many diverse sources for comparison and cross-file analysis.\n\nThere were some issues loading files from that dataset for a testing project. \n\nI extracted the following data files for this subset:\n\n- alpaca_data_cleaned\n- CoT_data\n- firefly \n- instruct\n- alpaca_gpt4_data\n- dolly \n- GPTeacher\n- thoughtsource\n- finance_en\n- instinwild_en"
] | [
"TAGS\n#task_categories-text-generation #task_categories-question-answering #size_categories-1M<n<10M #language-English #language-Chinese #license-apache-2.0 #cot #conversational #region-us \n",
"# Dataset Card for \"CoT_reformatted\"\n\nThis dataset is reformatted from: QingyiSi/Alpaca-CoT\n\nAll credit goes there. Thanks to QingyiSi for the work in consolidating many diverse sources for comparison and cross-file analysis.\n\nThere were some issues loading files from that dataset for a testing project. \n\nI extracted the following data files for this subset:\n\n- alpaca_data_cleaned\n- CoT_data\n- firefly \n- instruct\n- alpaca_gpt4_data\n- dolly \n- GPTeacher\n- thoughtsource\n- finance_en\n- instinwild_en"
] |
7076bdabc177f92c8527ccd9dfff0c7fe815ce2e | # Dataset Card for "cai-conversation-dev1705423021"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/cai-conversation-dev1705423021 | [
"region:us"
] | 2024-01-16T16:53:38+00:00 | {"dataset_info": {"features": [{"name": "index", "dtype": "int64"}, {"name": "prompt", "dtype": "string"}, {"name": "init_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "init_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "critic_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "critic_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "revision_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "revision_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "test_sft", "num_bytes": 3639403, "num_examples": 1156}, {"name": "test_prefs", "num_bytes": 3560662, "num_examples": 1156}], "download_size": 3007425, "dataset_size": 7200065}, "configs": [{"config_name": "default", "data_files": [{"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]} | 2024-01-16T16:53:40+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cai-conversation-dev1705423021"
More Information needed | [
"# Dataset Card for \"cai-conversation-dev1705423021\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cai-conversation-dev1705423021\"\n\nMore Information needed"
] |
504de1be1fa488aa9973351f53218f657e83f5da | # Dataset Card for "FineTuneDataset512FS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Lollitor/FineTuneDataset512FS | [
"region:us"
] | 2024-01-16T17:03:46+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "sequence", "dtype": "string"}, {"name": "label", "dtype": "float64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 7223777, "num_examples": 10096}, {"name": "validation", "num_bytes": 800695, "num_examples": 1122}], "download_size": 3991423, "dataset_size": 8024472}} | 2024-01-16T17:03:50+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "FineTuneDataset512FS"
More Information needed | [
"# Dataset Card for \"FineTuneDataset512FS\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"FineTuneDataset512FS\"\n\nMore Information needed"
] |
037c59722b8acf81cec4e79e01611ef2f93af647 |
This is a sharded version of the [PocketDoc/ConversationChronicles-sharegpt](https://huggingface.co/datasets/PocketDoc/ConversationChronicles-sharegpt) dataset, a sharegpt conversion of the [jihyoung/ConversationChronicles](https://huggingface.co/datasets/jihyoung/ConversationChronicles) dataset.
All dialogue was fixed (spacing, commas) and is spread across the different relationships available:
| Relationship | Count | Ratio |
| ------------------- | ------- | ----- |
| Classmates | 66,090 | 33.05% |
| Neighbors | 49,521 | 24.76% |
| Co-workers | 28,856 | 14.43% |
| Mentee and Mentor | 16,035 | 8.02% |
| Husband and Wife | 13,486 | 6.74% |
| Patient and Doctor | 6,980 | 3.49% |
| Parent and Child | 6,514 | 3.26% |
| Student and Teacher | 5,018 | 2.51% |
| Employee and Boss | 4,811 | 2.41% |
| Athlete and Coach | 2,689 | 1.34% |
| Total | 200,000* | |
*Counts can be slightly lower or higher due to cleaning and different formatting.
Episode IDs linked to the relationships available from the OG dataset can be seen [HERE](https://huggingface.co/datasets/Undi95/ConversationChronicles-sharegpt-SHARDED/raw/main/episodes.txt) | Undi95/ConversationChronicles-sharegpt-SHARDED | [
"task_categories:conversational",
"language:en",
"license:cc-by-4.0",
"region:us"
] | 2024-01-16T17:18:23+00:00 | {"language": ["en"], "license": "cc-by-4.0", "task_categories": ["conversational"], "pretty_name": "CC"} | 2024-01-16T18:58:08+00:00 | [] | [
"en"
] | TAGS
#task_categories-conversational #language-English #license-cc-by-4.0 #region-us
| This is a sharded version of the PocketDoc/ConversationChronicles-sharegpt dataset, a sharegpt conversion of the jihyoung/ConversationChronicles dataset.
All dialogue got fixed (space, coma) and spread across the different relationship available :
Relationship: Classmates, Count: 66,090, Ratio: 33.05%
Relationship: Neighbors, Count: 49,521, Ratio: 24.76%
Relationship: Co-workers, Count: 28,856, Ratio: 14.43%
Relationship: Mentee and Mentor, Count: 16,035, Ratio: 8.02%
Relationship: Husband and Wife, Count: 13,486, Ratio: 6.74%
Relationship: Patient and Doctor, Count: 6,980, Ratio: 3.49%
Relationship: Parent and Child, Count: 6,514, Ratio: 3.26%
Relationship: Student and Teacher, Count: 5,018, Ratio: 2.51%
Relationship: Employee and Boss, Count: 4,811, Ratio: 2.41%
Relationship: Athlete and Coach, Count: 2,689, Ratio: 1.34%
Relationship: Total, Count: 200,000\*, Ratio:
\*Count can be a less or more due to cleaning and different formatting.
Episodes ID linked to the relationship available from the OG dataset can be seen HERE
| [] | [
"TAGS\n#task_categories-conversational #language-English #license-cc-by-4.0 #region-us \n"
] |
c8e5de4ce5480d98448b0589579773cfa9868f6a |
<div dir="rtl">
### بطاقة مجموعة البيانات لـ "لا روبوت" 🙅♂️🤖
#### ملخص
"لا روبوتات" هي مجموعة بيانات تتكون من 10000 تعليمة وعرض، تم إنشاؤها بواسطة ملصقين محترفين. تمت ترجمتها باستخدام Google Cloud Platform Translation API. يمكن استخدام هذه المجموعة لتدريب نماذج اللغة على تتبع التعليمات بشكل أفضل (تدريب دقيق موجه بالتعليمات - SFT). تم إنشاء مجموعة "لا روبوتات" استنادًا إلى مجموعة البيانات الموصوفة في ورقة OpenAI's [InstructGPT](https://huggingface.co/papers/2203.02155)، وتشمل التصنيفات التالية:
</div>
| الفئة | العدد |
|:------------|------:|
| الإنشاء | 4560 |
| الأسئلة المفتوحة | 1240 |
| العصف الذهني | 1120 |
| الدردشة | 850 |
| إعادة الكتابة | 660 |
| الخلاصة | 420 |
| البرمجة | 350 |
| التصنيف | 350 |
| الأسئلة المغلقة | 260 |
| الاستخراج | 190 |
<div dir="rtl">
#### اللغات
تتوفر مجموعة البيانات هذه على اللغة العربية فقط. يمكن العثور على النسخة الأصلية باللغة الإنجليزية على [هذا الرابط](https://huggingface.co/datasets/HuggingFaceH4/no_robots)، والنسخة التركية على [هذا الرابط](https://huggingface.co/datasets/merve/tr-h4-norobots).
#### حقول البيانات
الأعمدة كالتالي:
* `prompt`: يحدد التعليمة التي يجب أن يتبعها النموذج.
* `prompt_id`: معرف فريد.
* `messages`: قائمة تحتوي على قواميس، كل قاموس يصف رسالة (key: content) ومن أرسلها (key: role).
* `category`: فئة المهمة، لم أقم بترجمة هذا.
#### التقسيمات
</div>
| | train | test |
|-------------------|----------:|--------:|
| لا روبوتات | 9500 | 500 |
<div dir="rtl">
#### الترخيص
مجموعة البيانات متاحة تحت رخصة [(CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode).
#### معلومات الاقتباس
</div>
```
@misc{no_robots,
author = {Nazneen Rajani and Lewis Tunstall and Edward Beeching and Nathan Lambert and Alexander M. Rush and Thomas Wolf},
title = {No Robots},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/datasets/HuggingFaceH4/no_robots}}
}
``` | 2A2I/H4_no_robots | [
"task_categories:conversational",
"task_categories:text-generation",
"language:ar",
"license:cc-by-nc-4.0",
"arxiv:2203.02155",
"region:us"
] | 2024-01-16T17:18:31+00:00 | {"language": ["ar"], "license": "cc-by-nc-4.0", "task_categories": ["conversational", "text-generation"], "pretty_name": "\u0644\u0627 \u0631\u0648\u0628\u0648\u062a\u0627\u062a", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 16496867, "num_examples": 9500}, {"name": "test", "num_bytes": 887460, "num_examples": 500}], "download_size": 11045465, "dataset_size": 17384327}} | 2024-01-30T12:17:50+00:00 | [
"2203.02155"
] | [
"ar"
] | TAGS
#task_categories-conversational #task_categories-text-generation #language-Arabic #license-cc-by-nc-4.0 #arxiv-2203.02155 #region-us
|
### Dataset Card for "No Robots"
#### Summary
"No Robots" is a dataset of 10,000 instructions and demonstrations created by professional annotators. It was translated using the Google Cloud Platform Translation API. This dataset can be used to train language models to follow instructions better (supervised fine-tuning, SFT). The "No Robots" dataset was created based on the dataset described in OpenAI's InstructGPT paper and includes the following categories:
#### Languages
This dataset is available in Arabic only. The original English version can be found at this link, and the Turkish version at this link.
#### Data Fields
The columns are as follows:
* 'prompt': specifies the instruction that the model should follow.
* 'prompt_id': a unique identifier.
* 'messages': a list of dictionaries, each describing a message (key: content) and who sent it (key: role).
* 'category': the task category; this field was not translated.
#### Splits
#### License
The dataset is available under the (CC BY-NC 4.0) license.
#### Citation Information
| [
"### بطاقة مجموعة البيانات لـ \"لا روبوت\" ️",
"#### ملخص\n\n\n\"لا روبوتات\" هي مجموعة بيانات تتكون من 10000 تعليمة وعرض، تم إنشاؤها بواسطة ملصقين محترفين. تمت ترجمتها باستخدام Google Cloud Platform Translation API. يمكن استخدام هذه المجموعة لتدريب نماذج اللغة على تتبع التعليمات بشكل أفضل (تدريب دقيق موجه بالتعليمات - SFT). تم إنشاء مجموعة \"لا روبوتات\" استنادًا إلى مجموعة البيانات الموصوفة في ورقة OpenAI's InstructGPT، وتشمل التصنيفات التالية:",
"#### اللغات\n\n\nتتوفر مجموعة البيانات هذه على اللغة العربية فقط. يمكن العثور على النسخة الأصلية باللغة الإنجليزية على هذا الرابط، والنسخة التركية على هذا الرابط.",
"#### حقول البيانات\n\n\nالأعمدة كالتالي:\n\n\n* 'prompt': يحدد التعليمة التي يجب أن يتبعها النموذج.\n* 'prompt\\_id': معرف فريد.\n* 'messages': قائمة تحتوي على قواميس، كل قاموس يصف رسالة (key: content) ومن أرسلها (key: role).\n* 'category': فئة المهمة، لم أقم بترجمة هذا.",
"#### التقسيمات",
"#### الترخيص\n\n\nمجموعة البيانات متاحة تحت رخصة (CC BY-NC 4.0).",
"#### معلومات الاقتباس"
] | [
"TAGS\n#task_categories-conversational #task_categories-text-generation #language-Arabic #license-cc-by-nc-4.0 #arxiv-2203.02155 #region-us \n",
"### بطاقة مجموعة البيانات لـ \"لا روبوت\" ️",
"#### ملخص\n\n\n\"لا روبوتات\" هي مجموعة بيانات تتكون من 10000 تعليمة وعرض، تم إنشاؤها بواسطة ملصقين محترفين. تمت ترجمتها باستخدام Google Cloud Platform Translation API. يمكن استخدام هذه المجموعة لتدريب نماذج اللغة على تتبع التعليمات بشكل أفضل (تدريب دقيق موجه بالتعليمات - SFT). تم إنشاء مجموعة \"لا روبوتات\" استنادًا إلى مجموعة البيانات الموصوفة في ورقة OpenAI's InstructGPT، وتشمل التصنيفات التالية:",
"#### اللغات\n\n\nتتوفر مجموعة البيانات هذه على اللغة العربية فقط. يمكن العثور على النسخة الأصلية باللغة الإنجليزية على هذا الرابط، والنسخة التركية على هذا الرابط.",
"#### حقول البيانات\n\n\nالأعمدة كالتالي:\n\n\n* 'prompt': يحدد التعليمة التي يجب أن يتبعها النموذج.\n* 'prompt\\_id': معرف فريد.\n* 'messages': قائمة تحتوي على قواميس، كل قاموس يصف رسالة (key: content) ومن أرسلها (key: role).\n* 'category': فئة المهمة، لم أقم بترجمة هذا.",
"#### التقسيمات",
"#### الترخيص\n\n\nمجموعة البيانات متاحة تحت رخصة (CC BY-NC 4.0).",
"#### معلومات الاقتباس"
] |
4305e07d2887b7f56671d80b759201fbe75e5fa6 | # Results for [*Keypoint-based Stereophotoclinometry for Characterizing and Navigating Small Bodies: A Factor Graph Approach*](https://arc.aiaa.org/doi/abs/10.2514/6.2024-0513) presented at the 2024 AIAA SciTech Forum
<a href="https://imgur.com/Zrvpxwe"><img src="https://i.imgur.com/Zrvpxwe.gif" title="source: imgur.com" /></a>
See [example.ipynb](https://huggingface.co/datasets/travisdriver/spc-factor-results/blob/main/example.ipynb) for instructions on loading and manipulating the reconstructions.
If you utilize our reconstructions or data, please cite [our paper](https://arc.aiaa.org/doi/abs/10.2514/6.2024-0513):
```bibtex
@inproceedings{driver2023spc,
title={Keypoint-based Stereophotoclinometry for Characterizing and Navigating Small Bodies: A Factor Graph Approach},
author={Driver, Travis and Vaughan, Andrew and Cheng, Yang and Ansar, Adnan and Christian, John and Tsiotras, Panagiotis},
booktitle={AIAA SciTech Forum},
pages={1-25},
address={Orlando, FL, USA},
month={January},
year={2024},
}
```
| travisdriver/spc-factor-results | [
"region:us"
] | 2024-01-16T17:21:46+00:00 | {"pretty_name": "spc-factor-results", "viewer": false} | 2024-01-18T15:32:23+00:00 | [] | [] | TAGS
#region-us
| # Results for *Keypoint-based Stereophotoclinometry for Characterizing and Navigating Small Bodies: A Factor Graph Approach* presented at the 2024 AIAA SciTech Forum
<a href="URL src="https://i.URL title="source: URL" /></a>
See URL for instructions on loading and manipulating the reconstructions.
If you utilize our reconstructions or data, please our paper:
| [
"# Results for *Keypoint-based Stereophotoclinometry for Characterizing and Navigating Small Bodies: A Factor Graph Approach* presented at the 2024 AIAA SciTech Forum\n\n<a href=\"URL src=\"https://i.URL title=\"source: URL\" /></a>\n\nSee URL for instructions on loading and manipulating the reconstructions.\n\nIf you utilize our reconstructions or data, please our paper:"
] | [
"TAGS\n#region-us \n",
"# Results for *Keypoint-based Stereophotoclinometry for Characterizing and Navigating Small Bodies: A Factor Graph Approach* presented at the 2024 AIAA SciTech Forum\n\n<a href=\"URL src=\"https://i.URL title=\"source: URL\" /></a>\n\nSee URL for instructions on loading and manipulating the reconstructions.\n\nIf you utilize our reconstructions or data, please our paper:"
] |
a0fbe9bfd98708a10a6d4350e9064084048c9659 |
# Dataset Card for Evaluation run of decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP](https://huggingface.co/decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_decruz07__kellemar-DPO-Orca-Distilled-7B-SLERP",
"harness_winogrande_5",
split="train")
```
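The aggregated scores live in the "results" configuration mentioned above; a minimal sketch for reading them (assuming that configuration is loaded the same way as the per-task ones) could be:
```python
from datasets import load_dataset

# Load the aggregated results; the "train" split points to the latest run.
results = load_dataset(
    "open-llm-leaderboard/details_decruz07__kellemar-DPO-Orca-Distilled-7B-SLERP",
    "results",
    split="train",
)
print(results[0])
```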
## Latest results
These are the [latest results from run 2024-01-16T17:19:43.357839](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__kellemar-DPO-Orca-Distilled-7B-SLERP/blob/main/results_2024-01-16T17-19-43.357839.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6581904352779381,
"acc_stderr": 0.03198559228658252,
"acc_norm": 0.6579223801812178,
"acc_norm_stderr": 0.03264927345286248,
"mc1": 0.4944920440636475,
"mc1_stderr": 0.01750243899045106,
"mc2": 0.6497116450794354,
"mc2_stderr": 0.015124463559805741
},
"harness|arc:challenge|25": {
"acc": 0.6766211604095563,
"acc_stderr": 0.013669421630012134,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.013329750293382318
},
"harness|hellaswag|10": {
"acc": 0.6964748058155746,
"acc_stderr": 0.004588403419449666,
"acc_norm": 0.8756223859788886,
"acc_norm_stderr": 0.003293374019781595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.0399926287661772,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.0399926287661772
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.01654240195463191,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.01654240195463191
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533126,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533126
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4944920440636475,
"mc1_stderr": 0.01750243899045106,
"mc2": 0.6497116450794354,
"mc2_stderr": 0.015124463559805741
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613988
},
"harness|gsm8k|5": {
"acc": 0.7202426080363912,
"acc_stderr": 0.012364384016735319
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_decruz07__kellemar-DPO-Orca-Distilled-7B-SLERP | [
"region:us"
] | 2024-01-16T17:22:02+00:00 | {"pretty_name": "Evaluation run of decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP", "dataset_summary": "Dataset automatically created during the evaluation run of model [decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP](https://huggingface.co/decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_decruz07__kellemar-DPO-Orca-Distilled-7B-SLERP\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T17:19:43.357839](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__kellemar-DPO-Orca-Distilled-7B-SLERP/blob/main/results_2024-01-16T17-19-43.357839.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6581904352779381,\n \"acc_stderr\": 0.03198559228658252,\n \"acc_norm\": 0.6579223801812178,\n \"acc_norm_stderr\": 0.03264927345286248,\n \"mc1\": 0.4944920440636475,\n \"mc1_stderr\": 0.01750243899045106,\n \"mc2\": 0.6497116450794354,\n \"mc2_stderr\": 0.015124463559805741\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6766211604095563,\n \"acc_stderr\": 0.013669421630012134,\n \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.013329750293382318\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6964748058155746,\n \"acc_stderr\": 0.004588403419449666,\n \"acc_norm\": 0.8756223859788886,\n \"acc_norm_stderr\": 0.003293374019781595\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.0399926287661772,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.0399926287661772\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8288633461047255,\n \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n \"acc_stderr\": 0.01654240195463191,\n \"acc_norm\": 0.42681564245810055,\n \"acc_norm_stderr\": 0.01654240195463191\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n \"acc_stderr\": 0.012752858346533126,\n \"acc_norm\": 0.47392438070404175,\n \"acc_norm_stderr\": 0.012752858346533126\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4944920440636475,\n \"mc1_stderr\": 0.01750243899045106,\n \"mc2\": 0.6497116450794354,\n \"mc2_stderr\": 0.015124463559805741\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613988\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7202426080363912,\n \"acc_stderr\": 0.012364384016735319\n 
}\n}\n```", "repo_url": "https://huggingface.co/decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|arc:challenge|25_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|gsm8k|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hellaswag|10_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T17-19-43.357839.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T17-19-43.357839.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T17-19-43.357839.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T17-19-43.357839.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T17-19-43.357839.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T17_19_43.357839", "path": ["**/details_harness|winogrande|5_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T17-19-43.357839.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_16T17_19_43.357839", "path": ["results_2024-01-16T17-19-43.357839.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T17-19-43.357839.parquet"]}]}]} | 2024-01-16T17:22:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP
Dataset automatically created during the evaluation run of model decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
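A minimal example (a sketch, not taken verbatim from this card: the repository id below is an assumption inferred from the "details_<org>__<model>" naming used by the other evaluation cards in this document):

```python
from datasets import load_dataset

# Assumed details repository for this model, following the usual naming pattern
data = load_dataset(
    "open-llm-leaderboard/details_decruz07__kellemar-DPO-Orca-Distilled-7B-SLERP",
    "harness_winogrande_5",   # one of the per-task configurations
    split="train",            # "train" always points to the latest results
)
```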
## Latest results
These are the latest results from run 2024-01-16T17:19:43.357839 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP\n\n\n\nDataset automatically created during the evaluation run of model decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T17:19:43.357839(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP\n\n\n\nDataset automatically created during the evaluation run of model decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T17:19:43.357839(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f8c1e0789dd72d9620631332208718ed7b2f9bb4 | # Dataset Card for "WCEP-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mtc/WCEP-filtered | [
"region:us"
] | 2024-01-16T17:38:44+00:00 | {"dataset_info": {"features": [{"name": "document", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10823988, "num_examples": 370}], "download_size": 5149647, "dataset_size": 10823988}} | 2024-01-17T10:48:22+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "WCEP-filtered"
More Information needed | [
"# Dataset Card for \"WCEP-filtered\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"WCEP-filtered\"\n\nMore Information needed"
] |
2ad19b8307bf669651c84fad7cfdf47b7c9230e1 |
# Dataset Card for Evaluation run of KnutJaegersberg/Nanbeige-16B-Base-32K-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Nanbeige-16B-Base-32K-llama](https://huggingface.co/KnutJaegersberg/Nanbeige-16B-Base-32K-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Nanbeige-16B-Base-32K-llama",
"harness_winogrande_5",
split="train")
```
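
In the same way, the aggregated "results" configuration described above can be loaded directly (a sketch; the configuration and split names follow the data files declared in this card's metadata):

```python
from datasets import load_dataset

# Aggregated metrics for the whole run, matching the JSON summary below
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Nanbeige-16B-Base-32K-llama",
    "results",
    split="latest",
)
```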
## Latest results
These are the [latest results from run 2024-01-16T17:54:07.755069](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Nanbeige-16B-Base-32K-llama/blob/main/results_2024-01-16T17-54-07.755069.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4511351260863433,
"acc_stderr": 0.03174859727422269,
"acc_norm": 0.4577622215400404,
"acc_norm_stderr": 0.03255795969436768,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.43430034129692835,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.4761092150170648,
"acc_norm_stderr": 0.014594701798071654
},
"harness|hellaswag|10": {
"acc": 0.5440151364270066,
"acc_stderr": 0.004970410081009455,
"acc_norm": 0.7308305118502291,
"acc_norm_stderr": 0.004426217654917996
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325583,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325583
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936338,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936338
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667765,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667765
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178277,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178277
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.02521731518484649,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.02521731518484649
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.02308663508684141,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.02308663508684141
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306386,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306386
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.014776765066438888,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.014776765066438888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159624,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159624
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.0274666102131401,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.0274666102131401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5864197530864198,
"acc_stderr": 0.02740204204026996,
"acc_norm": 0.5864197530864198,
"acc_norm_stderr": 0.02740204204026996
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5179738562091504,
"acc_stderr": 0.020214761037872404,
"acc_norm": 0.5179738562091504,
"acc_norm_stderr": 0.020214761037872404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.7292817679558011,
"acc_stderr": 0.012487904760626304
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.0028227133223877035
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Nanbeige-16B-Base-32K-llama | [
"region:us"
] | 2024-01-16T17:56:28+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Nanbeige-16B-Base-32K-llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Nanbeige-16B-Base-32K-llama](https://huggingface.co/KnutJaegersberg/Nanbeige-16B-Base-32K-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Nanbeige-16B-Base-32K-llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T17:54:07.755069](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Nanbeige-16B-Base-32K-llama/blob/main/results_2024-01-16T17-54-07.755069.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4511351260863433,\n \"acc_stderr\": 0.03174859727422269,\n \"acc_norm\": 0.4577622215400404,\n \"acc_norm_stderr\": 0.03255795969436768,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.43430034129692835,\n \"acc_stderr\": 0.01448470304885736,\n \"acc_norm\": 0.4761092150170648,\n \"acc_norm_stderr\": 0.014594701798071654\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5440151364270066,\n \"acc_stderr\": 0.004970410081009455,\n \"acc_norm\": 0.7308305118502291,\n \"acc_norm_stderr\": 0.004426217654917996\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n \"acc_stderr\": 0.04089465449325583,\n \"acc_norm\": 0.6041666666666666,\n \"acc_norm_stderr\": 0.04089465449325583\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n 
\"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936338,\n \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936338\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667765,\n \"acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667765\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178277,\n \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178277\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.02521731518484649,\n \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.02521731518484649\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115006,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115006\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.02308663508684141,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.02308663508684141\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 
0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306386,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306386\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n \"acc_stderr\": 0.014776765066438888,\n \"acc_norm\": 0.2659217877094972,\n \"acc_norm_stderr\": 0.014776765066438888\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159624,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159624\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n \"acc_stderr\": 0.0274666102131401,\n \"acc_norm\": 0.6270096463022508,\n \"acc_norm_stderr\": 0.0274666102131401\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5864197530864198,\n \"acc_stderr\": 0.02740204204026996,\n \"acc_norm\": 0.5864197530864198,\n \"acc_norm_stderr\": 0.02740204204026996\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5179738562091504,\n \"acc_stderr\": 0.020214761037872404,\n \"acc_norm\": 0.5179738562091504,\n \"acc_norm_stderr\": 0.020214761037872404\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7292817679558011,\n \"acc_stderr\": 0.012487904760626304\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \"acc_stderr\": 0.0028227133223877035\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/Nanbeige-16B-Base-32K-llama", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|arc:challenge|25_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|gsm8k|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hellaswag|10_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T17-54-07.755069.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T17-54-07.755069.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T17-54-07.755069.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T17-54-07.755069.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T17-54-07.755069.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T17-54-07.755069.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["**/details_harness|winogrande|5_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T17-54-07.755069.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T17_54_07.755069", "path": ["results_2024-01-16T17-54-07.755069.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T17-54-07.755069.parquet"]}]}]} | 2024-01-16T17:56:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/Nanbeige-16B-Base-32K-llama
Dataset automatically created during the evaluation run of model KnutJaegersberg/Nanbeige-16B-Base-32K-llama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
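A minimal sketch of that call, assuming the details are published under the leaderboard's usual `details_<org>__<model>` naming and using the `harness_winogrande_5` configuration as an example:

```python
from datasets import load_dataset

# The repository name below follows the leaderboard convention
# "details_<org>__<model>"; adjust it if this run was published elsewhere.
data = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Nanbeige-16B-Base-32K-llama",
    "harness_winogrande_5",
    split="train",
)
print(data)
```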
## Latest results
These are the latest results from run 2024-01-16T17:54:07.755069 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KnutJaegersberg/Nanbeige-16B-Base-32K-llama\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Nanbeige-16B-Base-32K-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T17:54:07.755069(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/Nanbeige-16B-Base-32K-llama\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Nanbeige-16B-Base-32K-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T17:54:07.755069(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
efc5031ee999c4448efa85e74e8a9fcf4f8ac8ce |
# Instruct-Aira Dataset version 2.0
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/Nkluge-correa/Aira
- **Point of Contact:** [AIRES at PUCRS](mailto:[email protected])
### Dataset Summary
This dataset contains a collection of single-turn conversations between an assistant and a user. Conversations were generated by user interactions with already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc). The dataset is available in Portuguese and English.
### Supported Tasks and Leaderboards
This dataset can be utilized for various natural language processing tasks, including but not limited to:
- Language modeling.
- Question-answering systems.
- Chatbot development.
- Evaluation of language models.
- Alignment research.
### Languages
English and Portuguese.
## Dataset Structure
### Data Instances
The dataset consists of the following features:
- **Conversation ID:** Identifier of the conversation.
- **Conversations:** A list of dictionaries following a [chat format](https://github.com/huggingface/blog/blob/main/chat-templates.md).
### Data Fields
```python
[
{'role': 'user', 'content': 'What is a language model?'},
{'role': 'assistant', 'content': 'A language model is a probability distribution over a vocabulary.'},
]
```
### Data Splits
Available splits are `english` and `portuguese`.
```python
from datasets import load_dataset
dataset = load_dataset("nicholasKluge/instruct-aira-dataset-v2", split='portuguese')
```
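As a usage sketch, each conversation can be rendered into a single training string with any Hugging Face tokenizer that defines a chat template (the model name below is purely illustrative):

```python
from datasets import load_dataset
from transformers import AutoTokenizer

dataset = load_dataset("nicholasKluge/instruct-aira-dataset-v2", split="english")

# Any tokenizer that ships a chat template works here; this one is only an example.
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")

# Each row's "conversations" field is a list of {"role", "content"} dictionaries,
# so it can be passed directly to the tokenizer's chat template.
text = tokenizer.apply_chat_template(
    dataset[0]["conversations"],
    tokenize=False,
    add_generation_prompt=False,
)
print(text)
```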
## Dataset Creation
### Curation Rationale
This dataset was developed as part of [Nicholas Kluge's](https://nkluge-correa.github.io/) doctoral dissertation, "_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._" This research was funded by CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to the Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn.
### Source Data
#### Initial Data Collection and Normalization
All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.
#### Who are the source language producers?
All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.
### Annotations
#### Annotation process
All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.
#### Who are the annotators?
No annotators were used.
### Personal and Sensitive Information
No personal or sensitive information is part of this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
No considerations.
### Discussion of Biases
No considerations.
### Other Known Limitations
No considerations.
## Additional Information
### Dataset Curators
[Nicholas Kluge Corrêa](mailto:[email protected]).
### Licensing Information
This dataset is licensed under the [Apache License, version 2.0](LICENSE).
### Citation Information
```latex
@misc{nicholas22aira,
doi = {10.5281/zenodo.6989727},
url = {https://github.com/Nkluge-correa/Aira},
author = {Nicholas Kluge Corrêa},
title = {Aira},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
}
```
### Contributions
If you would like to contribute, contact me at [[email protected]](mailto:[email protected])!
| nicholasKluge/instruct-aira-dataset-v2 | [
"task_categories:conversational",
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:pt",
"language:en",
"license:apache-2.0",
"alignment",
"instruction",
"chat",
"region:us"
] | 2024-01-16T18:01:58+00:00 | {"language": ["pt", "en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational", "text-generation"], "pretty_name": "Instruct-Aira Dataset version 2.0", "tags": ["alignment", "instruction", "chat"], "dataset_info": {"features": [{"name": "conversation_id", "dtype": "string"}, {"name": "conversations", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "portuguese", "num_bytes": 164002840, "num_examples": 81617}, {"name": "english", "num_bytes": 150265292, "num_examples": 81617}], "download_size": 169783232, "dataset_size": 314268132}, "configs": [{"config_name": "default", "data_files": [{"split": "portuguese", "path": "data/portuguese-*"}, {"split": "english", "path": "data/english-*"}]}]} | 2024-02-15T18:12:54+00:00 | [] | [
"pt",
"en"
] | TAGS
#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Portuguese #language-English #license-apache-2.0 #alignment #instruction #chat #region-us
|
# Instruct-Aira Dataset version 2.0
## Table of Contents
- Table of Contents
- Dataset Description
- Dataset Summary
- Supported Tasks and Leaderboards
- Languages
- Dataset Structure
- Data Instances
- Data Fields
- Data Splits
- Dataset Creation
- Curation Rationale
- Source Data
- Annotations
- Personal and Sensitive Information
- Considerations for Using the Data
- Social Impact of Dataset
- Discussion of Biases
- Other Known Limitations
- Additional Information
- Dataset Curators
- Licensing Information
- Citation Information
- Contributions
## Dataset Description
- Repository: URL
- Point of Contact: AIRES at PUCRS
### Dataset Summary
This dataset contains a collection of single-turn conversations between an assistant and a user. Conversations were generated by user interactions with already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc). The dataset is available in Portuguese and English.
### Supported Tasks and Leaderboards
This dataset can be utilized for various natural language processing tasks, including but not limited to:
- Language modeling.
- Question-answering systems.
- Chatbot development.
- Evaluation of language models.
- Alignment research.
### Languages
English and Portuguese.
## Dataset Structure
### Data Instances
The dataset consists of the following features:
- Conversation ID: Identifier of the conversation.
- Conversations: A list of dictionaries following a chat format.
### Data Fields
### Data Splits
Available splits are 'english' and 'portuguese'.
## Dataset Creation
### Curation Rationale
This dataset was developed as part of Nicholas Kluge's doctoral dissertation, "_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._" This research was funded by CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to the Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn.
### Source Data
#### Initial Data Collection and Normalization
All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.
#### Who are the source language producers?
All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.
### Annotations
#### Annotation process
All completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.
#### Who are the annotators?
No annotators were used.
### Personal and Sensitive Information
No personal or sensitive information is part of this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
No considerations.
### Discussion of Biases
No considerations.
### Other Known Limitations
No considerations.
## Additional Information
### Dataset Curators
Nicholas Kluge Corrêa.
### Licensing Information
This dataset is licensed under the Apache License, version 2.0.
### Contributions
If you would like to contribute, contact me at nicholas@URL!
| [
"# Instruct-Aira Dataset version 2.0",
"## Table of Contents\n\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n\n- Repository: URL\n- Point of Contact: AIRES at PUCRS",
"### Dataset Summary\n\nThis dataset contains a collection of single-turn conversations between an assistant and a user. Conversations were generated by user interactions with already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc). The dataset is available in Portuguese and English.",
"### Supported Tasks and Leaderboards\n\nThis dataset can be utilized for various natural language processing tasks, including but not limited to:\n\n- Language modeling.\n- Question-answering systems.\n- Chatbot development.\n- Evaluation of language models.\n- Alignment research.",
"### Languages\n\nEnglish and Portuguese.",
"## Dataset Structure",
"### Data Instances\n\nThe dataset consists of the following features:\n\n- Conversation ID: Identifier of the conversation.\n- Conversations: A list of dictionaries following a chat format.",
"### Data Fields",
"### Data Splits\n\nAvailable splits are 'english' and 'portuguese'.",
"## Dataset Creation",
"### Curation Rationale\n\nThis dataset was developed are part of Nicholas Kluge's doctoral dissertation, \"_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._\" This research was funded by CNPq (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.",
"#### Who are the source language producers?\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.",
"### Annotations",
"#### Annotation process\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.",
"#### Who are the annotators?\n\nNo annotators were used.",
"### Personal and Sensitive Information\n\nNo personal or sensitive information is part of this dataset.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nNo considerations.",
"### Discussion of Biases\n\nNo considerations.",
"### Other Known Limitations\n\nNo considerations.",
"## Additional Information",
"### Dataset Curators\n\nNicholas Kluge Corrêa.",
"### Licensing Information\n\nThis dataset is licensed under the Apache License, version 2.0.",
"### Contributions\n\nIf you would like to contribute, contact me at nicholas@URL!"
] | [
"TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-Portuguese #language-English #license-apache-2.0 #alignment #instruction #chat #region-us \n",
"# Instruct-Aira Dataset version 2.0",
"## Table of Contents\n\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n\n- Repository: URL\n- Point of Contact: AIRES at PUCRS",
"### Dataset Summary\n\nThis dataset contains a collection of single-turn conversations between an assistant and a user. Conversations were generated by user interactions with already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc). The dataset is available in Portuguese and English.",
"### Supported Tasks and Leaderboards\n\nThis dataset can be utilized for various natural language processing tasks, including but not limited to:\n\n- Language modeling.\n- Question-answering systems.\n- Chatbot development.\n- Evaluation of language models.\n- Alignment research.",
"### Languages\n\nEnglish and Portuguese.",
"## Dataset Structure",
"### Data Instances\n\nThe dataset consists of the following features:\n\n- Conversation ID: Identifier of the conversation.\n- Conversations: A list of dictionaries following a chat format.",
"### Data Fields",
"### Data Splits\n\nAvailable splits are 'english' and 'portuguese'.",
"## Dataset Creation",
"### Curation Rationale\n\nThis dataset was developed are part of Nicholas Kluge's doctoral dissertation, \"_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._\" This research was funded by CNPq (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.",
"#### Who are the source language producers?\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.",
"### Annotations",
"#### Annotation process\n\nAll completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.). Prompts were gathered from publicly available datasets.",
"#### Who are the annotators?\n\nNo annotators were used.",
"### Personal and Sensitive Information\n\nNo personal or sensitive information is part of this dataset.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nNo considerations.",
"### Discussion of Biases\n\nNo considerations.",
"### Other Known Limitations\n\nNo considerations.",
"## Additional Information",
"### Dataset Curators\n\nNicholas Kluge Corrêa.",
"### Licensing Information\n\nThis dataset is licensed under the Apache License, version 2.0.",
"### Contributions\n\nIf you would like to contribute, contact me at nicholas@URL!"
] |
41fba50002448533bd32adf38820189826d935f6 |
# Dataset Card for Evaluation run of Swisslex/Mixtral-8x7b-DPO-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Swisslex/Mixtral-8x7b-DPO-v0.2](https://huggingface.co/Swisslex/Mixtral-8x7b-DPO-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2",
"harness_winogrande_5",
split="train")
```
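The aggregated scores for this run can be pulled the same way by pointing at the "results" configuration; assuming this repository follows the same layout as the other evaluation runs in this collection, the "latest" split resolves to the most recent run:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" always points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2",
    "results",
    split="latest",
)
print(results[0])
```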
## Latest results
These are the [latest results from run 2024-01-16T18:18:43.502951](https://huggingface.co/datasets/open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2/blob/main/results_2024-01-16T18-18-43.502951.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7090305152663554,
"acc_stderr": 0.030408745551927793,
"acc_norm": 0.7130050743913563,
"acc_norm_stderr": 0.03099893935186279,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.5869091785507645,
"mc2_stderr": 0.01561392560307738
},
"harness|arc:challenge|25": {
"acc": 0.6715017064846417,
"acc_stderr": 0.013724978465537293,
"acc_norm": 0.7039249146757679,
"acc_norm_stderr": 0.01334091608524626
},
"harness|hellaswag|10": {
"acc": 0.6922923720374428,
"acc_stderr": 0.004606015773125625,
"acc_norm": 0.8773152758414658,
"acc_norm_stderr": 0.0032740447231806155
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.030643607071677098,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.030643607071677098
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.02619980880756192,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.02619980880756192
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.03063557897209328,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.03063557897209328
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5,
"acc_stderr": 0.04975185951049946,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04975185951049946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.044629175353369376,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.044629175353369376
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.039215453124671215,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.039215453124671215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.02173254068932928,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.02173254068932928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6305418719211823,
"acc_stderr": 0.03395970381998575,
"acc_norm": 0.6305418719211823,
"acc_norm_stderr": 0.03395970381998575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656208,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656208
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223157,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223157
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.023119362758232297,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.023119362758232297
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.026265024608275882,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.026265024608275882
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.01370874953417264,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.01370874953417264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997865,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997865
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568624,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878456,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878456
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7533632286995515,
"acc_stderr": 0.028930413120910877,
"acc_norm": 0.7533632286995515,
"acc_norm_stderr": 0.028930413120910877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476076,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476076
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.029634717272371037,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.029634717272371037
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018533,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018533
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.876117496807152,
"acc_stderr": 0.011781017100950737,
"acc_norm": 0.876117496807152,
"acc_norm_stderr": 0.011781017100950737
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7861271676300579,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.7861271676300579,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.016588680864530626,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.016588680864530626
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340853,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340853
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.020263764996385714,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.020263764996385714
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5221642764015645,
"acc_stderr": 0.012757683047716184,
"acc_norm": 0.5221642764015645,
"acc_norm_stderr": 0.012757683047716184
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7830882352941176,
"acc_stderr": 0.025035845227711268,
"acc_norm": 0.7830882352941176,
"acc_norm_stderr": 0.025035845227711268
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399687,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399687
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.5869091785507645,
"mc2_stderr": 0.01561392560307738
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.01066518790249844
},
"harness|gsm8k|5": {
"acc": 0.5754359363153905,
"acc_stderr": 0.013614835574956387
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2 | [
"region:us"
] | 2024-01-16T18:21:03+00:00 | {"pretty_name": "Evaluation run of Swisslex/Mixtral-8x7b-DPO-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Swisslex/Mixtral-8x7b-DPO-v0.2](https://huggingface.co/Swisslex/Mixtral-8x7b-DPO-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T18:18:43.502951](https://huggingface.co/datasets/open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2/blob/main/results_2024-01-16T18-18-43.502951.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7090305152663554,\n \"acc_stderr\": 0.030408745551927793,\n \"acc_norm\": 0.7130050743913563,\n \"acc_norm_stderr\": 0.03099893935186279,\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.5869091785507645,\n \"mc2_stderr\": 0.01561392560307738\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6715017064846417,\n \"acc_stderr\": 0.013724978465537293,\n \"acc_norm\": 0.7039249146757679,\n \"acc_norm_stderr\": 0.01334091608524626\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6922923720374428,\n \"acc_stderr\": 0.004606015773125625,\n \"acc_norm\": 0.8773152758414658,\n \"acc_norm_stderr\": 0.0032740447231806155\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.030643607071677098,\n \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.030643607071677098\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.02619980880756192,\n \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.02619980880756192\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n \"acc_stderr\": 0.03063557897209328,\n \"acc_norm\": 0.8402777777777778,\n \"acc_norm_stderr\": 0.03063557897209328\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.044629175353369376,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.044629175353369376\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.039215453124671215,\n \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.039215453124671215\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n \"acc_stderr\": 0.02173254068932928,\n \"acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.02173254068932928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6305418719211823,\n \"acc_stderr\": 0.03395970381998575,\n \"acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.03395970381998575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656208,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656208\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223157,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223157\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.023119362758232297,\n \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.023119362758232297\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.026265024608275882,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.026265024608275882\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997865,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997865\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568624,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568624\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878456,\n \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878456\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7533632286995515,\n \"acc_stderr\": 0.028930413120910877,\n \"acc_norm\": 0.7533632286995515,\n \"acc_norm_stderr\": 0.028930413120910877\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476076,\n \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476076\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.029634717272371037,\n \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.029634717272371037\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.876117496807152,\n \"acc_stderr\": 
0.011781017100950737,\n \"acc_norm\": 0.876117496807152,\n \"acc_norm_stderr\": 0.011781017100950737\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7861271676300579,\n \"acc_stderr\": 0.022075709251757183,\n \"acc_norm\": 0.7861271676300579,\n \"acc_norm_stderr\": 0.022075709251757183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n \"acc_stderr\": 0.016588680864530626,\n \"acc_norm\": 0.43687150837988825,\n \"acc_norm_stderr\": 0.016588680864530626\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340853,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340853\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385714,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385714\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5221642764015645,\n \"acc_stderr\": 0.012757683047716184,\n \"acc_norm\": 0.5221642764015645,\n \"acc_norm_stderr\": 0.012757683047716184\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7830882352941176,\n \"acc_stderr\": 0.025035845227711268,\n \"acc_norm\": 0.7830882352941176,\n \"acc_norm_stderr\": 0.025035845227711268\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.017322789207784326,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.017322789207784326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399687,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399687\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.5869091785507645,\n \"mc2_stderr\": 0.01561392560307738\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.01066518790249844\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5754359363153905,\n \"acc_stderr\": 0.013614835574956387\n }\n}\n```", "repo_url": 
"https://huggingface.co/Swisslex/Mixtral-8x7b-DPO-v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|arc:challenge|25_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|gsm8k|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hellaswag|10_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-18-43.502951.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-18-43.502951.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-18-43.502951.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T18-18-43.502951.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-18-43.502951.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-18-43.502951.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["**/details_harness|winogrande|5_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T18-18-43.502951.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T18_18_43.502951", "path": ["results_2024-01-16T18-18-43.502951.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T18-18-43.502951.parquet"]}]}]} | 2024-01-16T18:21:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Swisslex/Mixtral-8x7b-DPO-v0.2
Dataset automatically created during the evaluation run of model Swisslex/Mixtral-8x7b-DPO-v0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
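```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2",
	"harness_winogrande_5",
	split="train")
```

Here "harness_winogrande_5" is just one of the configs listed in this card; any other listed config and its "latest" split can be loaded the same way.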
## Latest results
These are the latest results from run 2024-01-16T18:18:43.502951 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Swisslex/Mixtral-8x7b-DPO-v0.2\n\n\n\nDataset automatically created during the evaluation run of model Swisslex/Mixtral-8x7b-DPO-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T18:18:43.502951(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Swisslex/Mixtral-8x7b-DPO-v0.2\n\n\n\nDataset automatically created during the evaluation run of model Swisslex/Mixtral-8x7b-DPO-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T18:18:43.502951(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
80a516db100c74df90261bad9c8124c185d1c61e |
Filter zxbsmk/laion_text_debiased_60M by image size to get a 512 subset (12,009,641 pairs), a 768 subset (4,915,850 pairs), and a 1024 subset (1,985,026 pairs). | zxbsmk/laion_text_debiased_60M | [
"license:apache-2.0",
"region:us"
] | 2024-01-16T18:44:34+00:00 | {"license": "apache-2.0"} | 2024-01-17T05:08:01+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
Filter zxbsmk/laion_text_debiased_60M by image size to get a 512 subset (12,009,641 pairs), a 768 subset (4,915,850 pairs), and a 1024 subset (1,985,026 pairs). | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
336fe8541b595e51b35e2246c0cce8f0c1aaf75b | # Dataset Card for "covost2_extract_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Codec-SUPERB/covost2_extract_unit | [
"region:us"
] | 2024-01-16T18:45:50+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k", "path": "data/encodec_24k-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "academicodec_hifi_16k_320d", "num_bytes": 203174296, "num_examples": 23778}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 203174296, "num_examples": 23778}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 304202488, "num_examples": 23778}, {"name": "audiodec_24k_320d", "num_bytes": 649246616, "num_examples": 23778}, {"name": "dac_16k", "num_bytes": 1275223416, "num_examples": 23778}, {"name": "dac_24k", "num_bytes": 3610151000, "num_examples": 23778}, {"name": "dac_44k", "num_bytes": 1075588320, "num_examples": 23778}, {"name": "encodec_24k", "num_bytes": 152981112, "num_examples": 23778}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 1624289624, "num_examples": 23778}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 1624289624, "num_examples": 23778}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 1624061016, "num_examples": 23778}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 815535192, "num_examples": 23778}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 1624061016, "num_examples": 23778}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 1624061016, "num_examples": 23778}, {"name": "speech_tokenizer_16k", "num_bytes": 406785816, "num_examples": 23778}], "download_size": 2582372226, "dataset_size": 16816824848}} | 2024-01-16T18:59:54+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "covost2_extract_unit"
More Information needed | [
"# Dataset Card for \"covost2_extract_unit\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"covost2_extract_unit\"\n\nMore Information needed"
] |
3797c1e4703ddfe4e5fa360395b1836fd570fa88 |
# Dataset Card for Evaluation run of AiMavenAi/AiMaven-SmartDawg-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AiMavenAi/AiMaven-SmartDawg-7b](https://huggingface.co/AiMavenAi/AiMaven-SmartDawg-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b",
"harness_winogrande_5",
split="train")
```
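
If you are not sure which configuration you need, you can first list everything the repository exposes. The following is a minimal sketch, assuming the standard `datasets` API; the `harness_gsm8k_5` configuration used as an example is one of the configurations named in this card's metadata:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b"

# List every per-task configuration stored in this repository
# (e.g. "harness_winogrande_5", "harness_gsm8k_5", ..., "results").
print(get_dataset_config_names(repo))

# The "latest" split of any configuration always points to the most recent run.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```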
## Latest results
These are the [latest results from run 2024-01-16T18:46:13.340145](https://huggingface.co/datasets/open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b/blob/main/results_2024-01-16T18-46-13.340145.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6302261241898747,
"acc_stderr": 0.03275821367319639,
"acc_norm": 0.6319182401020044,
"acc_norm_stderr": 0.033417578490707575,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447685,
"mc2": 0.5886125043102783,
"mc2_stderr": 0.015752842438606557
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726094,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946531
},
"harness|hellaswag|10": {
"acc": 0.6996614220274846,
"acc_stderr": 0.004574683373821048,
"acc_norm": 0.8716391157140012,
"acc_norm_stderr": 0.0033380760156172602
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.02513809138885111,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.02513809138885111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481006,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481006
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335068,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335068
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343138,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343138
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.014419123980931895,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.014419123980931895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106603,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603742,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603742
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399677,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399677
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.019249785691717213,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.019249785691717213
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447685,
"mc2": 0.5886125043102783,
"mc2_stderr": 0.015752842438606557
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.011446280629262631
},
"harness|gsm8k|5": {
"acc": 0.5724033358605004,
"acc_stderr": 0.013627322286986808
}
}
```
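
The same aggregated numbers can also be read programmatically from the "results" configuration instead of this JSON dump. This is a minimal sketch, assuming only the configuration and split names listed in this card; the exact column layout of the results table is not documented here, so the code inspects the schema before reading the row:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b",
    "results",
    split="latest",
)

# Print the schema first (the column layout is an assumption to verify),
# then look at the aggregated row for this run.
print(results.features)
print(results[0])
```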
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b | [
"region:us"
] | 2024-01-16T18:48:31+00:00 | {"pretty_name": "Evaluation run of AiMavenAi/AiMaven-SmartDawg-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [AiMavenAi/AiMaven-SmartDawg-7b](https://huggingface.co/AiMavenAi/AiMaven-SmartDawg-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T18:46:13.340145](https://huggingface.co/datasets/open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b/blob/main/results_2024-01-16T18-46-13.340145.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6302261241898747,\n \"acc_stderr\": 0.03275821367319639,\n \"acc_norm\": 0.6319182401020044,\n \"acc_norm_stderr\": 0.033417578490707575,\n \"mc1\": 0.4186046511627907,\n \"mc1_stderr\": 0.01727001528447685,\n \"mc2\": 0.5886125043102783,\n \"mc2_stderr\": 0.015752842438606557\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726094,\n \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946531\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6996614220274846,\n \"acc_stderr\": 0.004574683373821048,\n \"acc_norm\": 0.8716391157140012,\n \"acc_norm_stderr\": 0.0033380760156172602\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 
0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467381,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467381\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885111\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481006,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481006\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.658974358974359,\n \"acc_stderr\": 0.024035489676335068,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335068\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343138,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343138\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n \"acc_stderr\": 0.014419123980931895,\n \"acc_norm\": 
0.7956577266922095,\n \"acc_norm_stderr\": 0.014419123980931895\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603742,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603742\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291467,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291467\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n \"acc_stderr\": 0.012697046024399677,\n \"acc_norm\": 0.44654498044328556,\n \"acc_norm_stderr\": 0.012697046024399677\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.019249785691717213,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.019249785691717213\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n \"mc1_stderr\": 0.01727001528447685,\n \"mc2\": 0.5886125043102783,\n \"mc2_stderr\": 0.015752842438606557\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.011446280629262631\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5724033358605004,\n \"acc_stderr\": 0.013627322286986808\n }\n}\n```", "repo_url": "https://huggingface.co/AiMavenAi/AiMaven-SmartDawg-7b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|arc:challenge|25_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|gsm8k|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hellaswag|10_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-46-13.340145.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-46-13.340145.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-46-13.340145.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T18-46-13.340145.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-46-13.340145.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-46-13.340145.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["**/details_harness|winogrande|5_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T18-46-13.340145.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T18_46_13.340145", "path": ["results_2024-01-16T18-46-13.340145.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T18-46-13.340145.parquet"]}]}]} | 2024-01-16T18:48:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AiMavenAi/AiMaven-SmartDawg-7b
Dataset automatically created during the evaluation run of model AiMavenAi/AiMaven-SmartDawg-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
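The loading snippet from the original card is not reproduced in this stripped text; below is a minimal sketch, assuming the repo id follows the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern used by these leaderboard detail datasets:

```python
from datasets import load_dataset

# Repo id assumed from the standard leaderboard naming convention;
# "harness_winogrande_5" is one of the per-task configurations listed in the metadata.
data = load_dataset(
    "open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b",
    "harness_winogrande_5",
    split="train",
)
```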
## Latest results
These are the latest results from run 2024-01-16T18:46:13.340145 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AiMavenAi/AiMaven-SmartDawg-7b\n\n\n\nDataset automatically created during the evaluation run of model AiMavenAi/AiMaven-SmartDawg-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T18:46:13.340145(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AiMavenAi/AiMaven-SmartDawg-7b\n\n\n\nDataset automatically created during the evaluation run of model AiMavenAi/AiMaven-SmartDawg-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T18:46:13.340145(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
31a4c47468a0ea7ddaae265be7df64a1c297028b | # Dataset Card for "10000-20000-ultrafeedback-binarized-preferences-cleaned-ita"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | giux78/10000-20000-ultrafeedback-binarized-preferences-cleaned-ita | [
"region:us"
] | 2024-01-16T19:05:18+00:00 | {"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen-rating", "dtype": "float64"}, {"name": "chosen-model", "dtype": "string"}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected-rating", "dtype": "float64"}, {"name": "rejected-model", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 50545239, "num_examples": 10000}], "download_size": 18938268, "dataset_size": 50545239}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T19:05:40+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "10000-20000-ultrafeedback-binarized-preferences-cleaned-ita"
More Information needed | [
"# Dataset Card for \"10000-20000-ultrafeedback-binarized-preferences-cleaned-ita\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"10000-20000-ultrafeedback-binarized-preferences-cleaned-ita\"\n\nMore Information needed"
] |
e0f42dbce5ad08d23cb534e883bfdf8cb6084a8a |
# Dataset Card for Evaluation run of pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2](https://huggingface.co/pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pinkyponky__Mistral-7B-Instruct-Sft-Tuned-V0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-17T02:42:41.237336](https://huggingface.co/datasets/open-llm-leaderboard/details_pinkyponky__Mistral-7B-Instruct-Sft-Tuned-V0.2/blob/main/results_2024-01-17T02-42-41.237336.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5792563382823662,
"acc_stderr": 0.03354124617135213,
"acc_norm": 0.5824584741857151,
"acc_norm_stderr": 0.03422621063406686,
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.5065814487425014,
"mc2_stderr": 0.014840890941401303
},
"harness|arc:challenge|25": {
"acc": 0.5255972696245734,
"acc_stderr": 0.014592230885298962,
"acc_norm": 0.5733788395904437,
"acc_norm_stderr": 0.014453185592920295
},
"harness|hellaswag|10": {
"acc": 0.5903206532563234,
"acc_stderr": 0.004907694727935687,
"acc_norm": 0.7894841665006971,
"acc_norm_stderr": 0.004068418417275672
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.029445175328199596,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.029445175328199596
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.027666182075539638,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.027666182075539638
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264715,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5461538461538461,
"acc_stderr": 0.025242770987126184,
"acc_norm": 0.5461538461538461,
"acc_norm_stderr": 0.025242770987126184
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.031918633744784645,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.031918633744784645
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.744954128440367,
"acc_stderr": 0.01868850085653581,
"acc_norm": 0.744954128440367,
"acc_norm_stderr": 0.01868850085653581
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.03149328104507956,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.03149328104507956
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748928,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748928
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395958,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395958
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005566,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005566
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159614,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159614
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580215,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580215
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213514,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213514
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.019977422600227474,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.019977422600227474
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.5065814487425014,
"mc2_stderr": 0.014840890941401303
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702304
},
"harness|gsm8k|5": {
"acc": 0.45489006823351025,
"acc_stderr": 0.013716318771794606
}
}
```
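The aggregated numbers above are also exposed through the "results" configuration of this dataset; a minimal sketch of reading its "latest" split, assuming it follows the same layout as the other leaderboard detail datasets:

```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics for a run;
# the "latest" split points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_pinkyponky__Mistral-7B-Instruct-Sft-Tuned-V0.2",
    "results",
    split="latest",
)
print(results[0])
```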
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_pinkyponky__Mistral-7B-Instruct-sft-tuned-v0.2 | [
"region:us"
] | 2024-01-16T19:10:08+00:00 | {"pretty_name": "Evaluation run of pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2](https://huggingface.co/pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pinkyponky__Mistral-7B-Instruct-Sft-Tuned-V0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T02:42:41.237336](https://huggingface.co/datasets/open-llm-leaderboard/details_pinkyponky__Mistral-7B-Instruct-Sft-Tuned-V0.2/blob/main/results_2024-01-17T02-42-41.237336.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5792563382823662,\n \"acc_stderr\": 0.03354124617135213,\n \"acc_norm\": 0.5824584741857151,\n \"acc_norm_stderr\": 0.03422621063406686,\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.5065814487425014,\n \"mc2_stderr\": 0.014840890941401303\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5255972696245734,\n \"acc_stderr\": 0.014592230885298962,\n \"acc_norm\": 0.5733788395904437,\n \"acc_norm_stderr\": 0.014453185592920295\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5903206532563234,\n \"acc_stderr\": 0.004907694727935687,\n \"acc_norm\": 0.7894841665006971,\n \"acc_norm_stderr\": 0.004068418417275672\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.029445175328199596,\n \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.029445175328199596\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.04048439222695598\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n \"acc_stderr\": 0.027666182075539638,\n \"acc_norm\": 0.6161290322580645,\n \"acc_norm_stderr\": 0.027666182075539638\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.031353050095330855,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.031353050095330855\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n \"acc_norm\": 0.8238341968911918,\n 
\"acc_norm_stderr\": 0.027493504244548057\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5461538461538461,\n \"acc_stderr\": 0.025242770987126184,\n \"acc_norm\": 0.5461538461538461,\n \"acc_norm_stderr\": 0.025242770987126184\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.031918633744784645,\n \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.031918633744784645\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.744954128440367,\n \"acc_stderr\": 0.01868850085653581,\n \"acc_norm\": 0.744954128440367,\n \"acc_norm_stderr\": 0.01868850085653581\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.03149328104507956,\n \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.03149328104507956\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.02363687331748928,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.02363687331748928\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n \"acc_stderr\": 0.015464676163395958,\n \"acc_norm\": 0.7509578544061303,\n \"acc_norm_stderr\": 0.015464676163395958\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n \"acc_stderr\": 0.015476515438005566,\n \"acc_norm\": 0.3106145251396648,\n \"acc_norm_stderr\": 0.015476515438005566\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159614,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159614\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n \"acc_stderr\": 0.012676014778580215,\n \"acc_norm\": 0.439374185136897,\n \"acc_norm_stderr\": 0.012676014778580215\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.019977422600227474,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.019977422600227474\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.5065814487425014,\n \"mc2_stderr\": 0.014840890941401303\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702304\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45489006823351025,\n 
\"acc_stderr\": 0.013716318771794606\n }\n}\n```", "repo_url": "https://huggingface.co/pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|arc:challenge|25_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|arc:challenge|25_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|gsm8k|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|gsm8k|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hellaswag|10_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hellaswag|10_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T19-07-53.653078.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T19-07-53.653078.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T02-42-41.237336.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T02-42-41.237336.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T02-42-41.237336.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T02-42-41.237336.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T02-42-41.237336.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": 
"2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T19-07-53.653078.parquet"]}, 
{"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["**/details_harness|winogrande|5_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": ["**/details_harness|winogrande|5_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T02-42-41.237336.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T19_07_53.653078", "path": ["results_2024-01-16T19-07-53.653078.parquet"]}, {"split": "2024_01_17T02_42_41.237336", "path": 
["results_2024-01-17T02-42-41.237336.parquet"]}, {"split": "latest", "path": ["results_2024-01-17T02-42-41.237336.parquet"]}]}]} | 2024-01-17T02:45:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2
Dataset automatically created during the evaluation run of model pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
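A minimal sketch of that call is given below; the repository id is an assumption, inferred from the leaderboard's usual `details_<org>__<model>` naming for this model:

```python
from datasets import load_dataset

# Assumed repository id, following the "details_<org>__<model>" naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_pinkyponky__Mistral-7B-Instruct-Sft-Tuned-V0.2",
    "harness_winogrande_5",
    split="train",
)
```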
## Latest results
These are the latest results from run 2024-01-17T02:42:41.237336 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2\n\n\n\nDataset automatically created during the evaluation run of model pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T02:42:41.237336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2\n\n\n\nDataset automatically created during the evaluation run of model pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T02:42:41.237336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f8e86ce41d56348fd770ae7c0c392ba740f852f0 |
# Dataset Card for Evaluation run of ajibawa-2023/Code-290k-13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ajibawa-2023/Code-290k-13B](https://huggingface.co/ajibawa-2023/Code-290k-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Code-290k-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T19:33:39.851103](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-290k-13B/blob/main/results_2024-01-16T19-33-39.851103.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5196636424489082,
"acc_stderr": 0.03413919567753767,
"acc_norm": 0.5255766098495468,
"acc_norm_stderr": 0.034888163510772265,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.01541524174023702,
"mc2": 0.3765086228702086,
"mc2_stderr": 0.01531404683044936
},
"harness|arc:challenge|25": {
"acc": 0.5332764505119454,
"acc_stderr": 0.01457899585960581,
"acc_norm": 0.560580204778157,
"acc_norm_stderr": 0.014503747823580122
},
"harness|hellaswag|10": {
"acc": 0.6281617207727545,
"acc_stderr": 0.004823078145064964,
"acc_norm": 0.8154750049790879,
"acc_norm_stderr": 0.0038711896202760715
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.04060127035236395,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.04060127035236395
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5471698113207547,
"acc_stderr": 0.03063562795796182,
"acc_norm": 0.5471698113207547,
"acc_norm_stderr": 0.03063562795796182
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.02479606060269995,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.02479606060269995
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6313131313131313,
"acc_stderr": 0.03437305501980619,
"acc_norm": 0.6313131313131313,
"acc_norm_stderr": 0.03437305501980619
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230172,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230172
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.48739495798319327,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.48739495798319327,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6954128440366972,
"acc_stderr": 0.01973229942035406,
"acc_norm": 0.6954128440366972,
"acc_norm_stderr": 0.01973229942035406
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.0313217980308329,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.0313217980308329
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138605,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138605
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.033141902221106564,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.033141902221106564
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.588957055214724,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.588957055214724,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196704,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196704
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7113665389527458,
"acc_stderr": 0.01620379270319778,
"acc_norm": 0.7113665389527458,
"acc_norm_stderr": 0.01620379270319778
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.569364161849711,
"acc_stderr": 0.026658800273672376,
"acc_norm": 0.569364161849711,
"acc_norm_stderr": 0.026658800273672376
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808852,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808852
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.02843109544417664,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.02843109544417664
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946205,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199502,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199502
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40352020860495436,
"acc_stderr": 0.012530241301193179,
"acc_norm": 0.40352020860495436,
"acc_norm_stderr": 0.012530241301193179
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.030254372573976694,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.030254372573976694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.020223946005074305,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.020223946005074305
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.03125127591089165,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.03125127591089165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.03345563070339193,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.03345563070339193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.01541524174023702,
"mc2": 0.3765086228702086,
"mc2_stderr": 0.01531404683044936
},
"harness|winogrande|5": {
"acc": 0.7269139700078927,
"acc_stderr": 0.012522020105869456
},
"harness|gsm8k|5": {
"acc": 0.17816527672479152,
"acc_stderr": 0.010540132527549487
}
}
```
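For programmatic access to these aggregated numbers, a minimal sketch is shown below; it assumes the "results" configuration and its "latest" split follow the same naming used for the per-task configurations of this repository:

```python
from datasets import load_dataset

# "results" aggregates all task scores for a run; "latest" points to the most recent eval.
results = load_dataset(
    "open-llm-leaderboard/details_ajibawa-2023__Code-290k-13B",
    "results",
    split="latest",
)
print(results[0])
```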
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ajibawa-2023__Code-290k-13B | [
"region:us"
] | 2024-01-16T19:36:03+00:00 | {"pretty_name": "Evaluation run of ajibawa-2023/Code-290k-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ajibawa-2023/Code-290k-13B](https://huggingface.co/ajibawa-2023/Code-290k-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ajibawa-2023__Code-290k-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T19:33:39.851103](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-290k-13B/blob/main/results_2024-01-16T19-33-39.851103.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5196636424489082,\n \"acc_stderr\": 0.03413919567753767,\n \"acc_norm\": 0.5255766098495468,\n \"acc_norm_stderr\": 0.034888163510772265,\n \"mc1\": 0.2631578947368421,\n \"mc1_stderr\": 0.01541524174023702,\n \"mc2\": 0.3765086228702086,\n \"mc2_stderr\": 0.01531404683044936\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5332764505119454,\n \"acc_stderr\": 0.01457899585960581,\n \"acc_norm\": 0.560580204778157,\n \"acc_norm_stderr\": 0.014503747823580122\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6281617207727545,\n \"acc_stderr\": 0.004823078145064964,\n \"acc_norm\": 0.8154750049790879,\n \"acc_norm_stderr\": 0.0038711896202760715\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.04060127035236395,\n \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.04060127035236395\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5471698113207547,\n \"acc_stderr\": 0.03063562795796182,\n \"acc_norm\": 0.5471698113207547,\n \"acc_norm_stderr\": 0.03063562795796182\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 
0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.02479606060269995,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.02479606060269995\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.034454876862647144,\n \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.034454876862647144\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6313131313131313,\n \"acc_stderr\": 0.03437305501980619,\n \"acc_norm\": 0.6313131313131313,\n \"acc_norm_stderr\": 0.03437305501980619\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752954,\n \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.48717948717948717,\n \"acc_stderr\": 0.025342671293807257,\n \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.025342671293807257\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230172,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230172\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.48739495798319327,\n \"acc_stderr\": 0.03246816765752174,\n \"acc_norm\": 0.48739495798319327,\n \"acc_norm_stderr\": 0.03246816765752174\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6954128440366972,\n \"acc_stderr\": 0.01973229942035406,\n \"acc_norm\": 0.6954128440366972,\n \"acc_norm_stderr\": 0.01973229942035406\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.0313217980308329,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.0313217980308329\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138605,\n \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138605\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.57847533632287,\n \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.588957055214724,\n \"acc_stderr\": 0.038656978537853624,\n \"acc_norm\": 0.588957055214724,\n \"acc_norm_stderr\": 0.038656978537853624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.027236013946196704,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.027236013946196704\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7113665389527458,\n \"acc_stderr\": 
0.01620379270319778,\n \"acc_norm\": 0.7113665389527458,\n \"acc_norm_stderr\": 0.01620379270319778\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.569364161849711,\n \"acc_stderr\": 0.026658800273672376,\n \"acc_norm\": 0.569364161849711,\n \"acc_norm_stderr\": 0.026658800273672376\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808852,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808852\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.02843109544417664,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.02843109544417664\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n \"acc_stderr\": 0.028071928247946205,\n \"acc_norm\": 0.5755627009646302,\n \"acc_norm_stderr\": 0.028071928247946205\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199502,\n \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199502\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40352020860495436,\n \"acc_stderr\": 0.012530241301193179,\n \"acc_norm\": 0.40352020860495436,\n \"acc_norm_stderr\": 0.012530241301193179\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.030254372573976694,\n \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.030254372573976694\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.020223946005074305,\n \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.020223946005074305\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n \"acc_stderr\": 0.03345563070339193,\n \"acc_norm\": 0.6616915422885572,\n \"acc_norm_stderr\": 0.03345563070339193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n \"mc1_stderr\": 0.01541524174023702,\n \"mc2\": 0.3765086228702086,\n \"mc2_stderr\": 0.01531404683044936\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7269139700078927,\n \"acc_stderr\": 0.012522020105869456\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17816527672479152,\n \"acc_stderr\": 0.010540132527549487\n }\n}\n```", "repo_url": 
"https://huggingface.co/ajibawa-2023/Code-290k-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|arc:challenge|25_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|gsm8k|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hellaswag|10_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T19-33-39.851103.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T19-33-39.851103.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T19-33-39.851103.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T19-33-39.851103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T19-33-39.851103.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T19-33-39.851103.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["**/details_harness|winogrande|5_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T19-33-39.851103.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T19_33_39.851103", "path": ["results_2024-01-16T19-33-39.851103.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T19-33-39.851103.parquet"]}]}]} | 2024-01-16T19:36:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ajibawa-2023/Code-290k-13B
Dataset automatically created during the evaluation run of model ajibawa-2023/Code-290k-13B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
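```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Code-290k-13B",
	"harness_winogrande_5",
	split="train")
```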
## Latest results
These are the latest results from run 2024-01-16T19:33:39.851103 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ajibawa-2023/Code-290k-13B\n\n\n\nDataset automatically created during the evaluation run of model ajibawa-2023/Code-290k-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T19:33:39.851103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ajibawa-2023/Code-290k-13B\n\n\n\nDataset automatically created during the evaluation run of model ajibawa-2023/Code-290k-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T19:33:39.851103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0e3062c1f1b0f690494844c4465e220fa4ec8260 |
# Dataset Card for Wildberries products
### Dataset Summary
This dataset was scraped from product pages on the Russian marketplace [Wildberries](https://www.wildberries.ru). It includes all information from the product card and metadata from the API, excluding image URLs. The dataset was collected by processing approximately 160 million products out of a potential 230 million, starting from the first product. Data collection had to be stopped due to serious rate limits that prevented further progress. The data is in zstd archives containing jsonl files. Each archive contains data from a specific Wildberries data server identified by a basket server number.
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
### Data Fields
This dataset includes the following fields:
- `imt_id`: Identifier for the item (integer)
- `nm_id`: Numeric identifier associated with the item (integer)
- `imt_name`: Name of the product (string)
- `subj_name`: Subject name (string)
- `subj_root_name`: Root subject name (string)
- `nm_colors_names`: Color names (string, may be empty)
- `vendor_code`: Vendor code (string)
- `description`: Description of the product (string, may be empty)
- `brand_name`: Name of the brand (string)
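For reference, here is a minimal sketch of iterating over one of the zstd-compressed JSONL archives with the `zstandard` package; the file name `basket-01.jsonl.zst` is only an illustrative placeholder, not an actual archive name from the dataset.

```python
import io
import json

import zstandard  # pip install zstandard


def iter_products(path):
    """Yield product records from a zstd-compressed JSONL archive."""
    with open(path, "rb") as fh:
        reader = zstandard.ZstdDecompressor().stream_reader(fh)
        # Wrap the binary stream so we can read decoded text line by line.
        for line in io.TextIOWrapper(reader, encoding="utf-8"):
            yield json.loads(line)


# Illustrative file name; real archives are named after basket server numbers.
for product in iter_products("basket-01.jsonl.zst"):
    print(product["nm_id"], product["imt_name"], product["brand_name"])
    break
```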
### Data Splits
All examples are in the train split; there is no validation split.
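If the Hub's automatic loader picks up the archives (this depends on the repository layout and requires the `zstandard` package for decompression), streaming the train split might look roughly like this:

```python
from datasets import load_dataset

# Streaming avoids downloading the full dump up front.
products = load_dataset("nyuuzyou/wb-products", split="train", streaming=True)
for product in products:
    print(product["imt_name"])
    break
```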
## Additional Information
### License
This dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:
* Use it for any purpose, including commercial projects.
* Modify it however you like.
* Distribute it without asking permission.
No attribution is required, but it's always appreciated!
CC0 license: https://creativecommons.org/publicdomain/zero/1.0/deed.en
To learn more about CC0, visit the Creative Commons website: https://creativecommons.org/publicdomain/zero/1.0/
### Dataset Curators
- [nyuuzyou](https://ducks.party)
| nyuuzyou/wb-products | [
"task_categories:text-generation",
"task_ids:language-modeling",
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"multilinguality:monolingual",
"size_categories:100M<n<1B",
"source_datasets:original",
"language:ru",
"license:cc0-1.0",
"region:us"
] | 2024-01-16T19:42:07+00:00 | {"annotations_creators": ["crowdsourced"], "language_creators": ["crowdsourced"], "language": ["ru"], "license": ["cc0-1.0"], "multilinguality": ["monolingual"], "size_categories": ["100M<n<1B"], "source_datasets": ["original"], "task_categories": ["text-generation"], "task_ids": ["language-modeling"], "pretty_name": "Wildberries products"} | 2024-01-16T20:06:09+00:00 | [] | [
"ru"
] | TAGS
#task_categories-text-generation #task_ids-language-modeling #annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-monolingual #size_categories-100M<n<1B #source_datasets-original #language-Russian #license-cc0-1.0 #region-us
|
# Dataset Card for Wildberries products
### Dataset Summary
This dataset was scraped from product pages on the Russian marketplace Wildberries. It includes all information from the product card and metadata from the API, excluding image URLs. The dataset was collected by processing approximately 160 million products out of a potential 230 million, starting from the first product. Data collection had to be stopped due to serious rate limits that prevented further progress. The data is in zstd archives containing jsonl files. Each archive contains data from a specific Wildberries data server identified by a basket server number.
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
### Data Fields
This dataset includes the following fields:
- 'imt_id': Identifier for the item (integer)
- 'nm_id': Numeric identifier associated with the item (integer)
- 'imt_name': Name of the product (string)
- 'subj_name': Subject name (string)
- 'subj_root_name': Root subject name (string)
- 'nm_colors_names': Colors names (string, may be empty)
- 'vendor_code': Vendor code (string)
- 'description': Description of the product (string, may be empty)
- 'brand_name': Name of the brand (string)
### Data Splits
All examples are in the train split, there is no validation split.
## Additional Information
### License
This dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:
* Use it for any purpose, including commercial projects.
* Modify it however you like.
* Distribute it without asking permission.
No attribution is required, but it's always appreciated!
CC0 license: URL
To learn more about CC0, visit the Creative Commons website: URL
### Dataset Curators
- nyuuzyou
| [
"# Dataset Card for Wildberries products",
"### Dataset Summary\n\nThis dataset was scraped from product pages on the Russian marketplace Wildberries. It includes all information from the product card and metadata from the API, excluding image URLs. The dataset was collected by processing approximately 160 million products out of a potential 230 million, starting from the first product. Data collection had to be stopped due to serious rate limits that prevented further progress. The data is in zstd archives containing jsonl files. Each archive contains data from a specific Wildberries data server identified by a basket server number.",
"### Languages\n\nThe dataset is mostly in Russian, but there may be other languages present.",
"## Dataset Structure",
"### Data Fields\n\nThis dataset includes the following fields:\n\n- 'imt_id': Identifier for the item (integer)\n- 'nm_id': Numeric identifier associated with the item (integer)\n- 'imt_name': Name of the product (string)\n- 'subj_name': Subject name (string)\n- 'subj_root_name': Root subject name (string)\n- 'nm_colors_names': Colors names (string, may be empty)\n- 'vendor_code': Vendor code (string)\n- 'description': Description of the product (string, may be empty)\n- 'brand_name': Name of the brand (string)",
"### Data Splits\n\nAll examples are in the train split, there is no validation split.",
"## Additional Information",
"### License\n\nThis dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:\n\n* Use it for any purpose, including commercial projects.\n* Modify it however you like.\n* Distribute it without asking permission.\n\nNo attribution is required, but it's always appreciated!\n\nCC0 license: URL\n\nTo learn more about CC0, visit the Creative Commons website: URL",
"### Dataset Curators\n\n- nyuuzyou"
] | [
"TAGS\n#task_categories-text-generation #task_ids-language-modeling #annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-monolingual #size_categories-100M<n<1B #source_datasets-original #language-Russian #license-cc0-1.0 #region-us \n",
"# Dataset Card for Wildberries products",
"### Dataset Summary\n\nThis dataset was scraped from product pages on the Russian marketplace Wildberries. It includes all information from the product card and metadata from the API, excluding image URLs. The dataset was collected by processing approximately 160 million products out of a potential 230 million, starting from the first product. Data collection had to be stopped due to serious rate limits that prevented further progress. The data is in zstd archives containing jsonl files. Each archive contains data from a specific Wildberries data server identified by a basket server number.",
"### Languages\n\nThe dataset is mostly in Russian, but there may be other languages present.",
"## Dataset Structure",
"### Data Fields\n\nThis dataset includes the following fields:\n\n- 'imt_id': Identifier for the item (integer)\n- 'nm_id': Numeric identifier associated with the item (integer)\n- 'imt_name': Name of the product (string)\n- 'subj_name': Subject name (string)\n- 'subj_root_name': Root subject name (string)\n- 'nm_colors_names': Colors names (string, may be empty)\n- 'vendor_code': Vendor code (string)\n- 'description': Description of the product (string, may be empty)\n- 'brand_name': Name of the brand (string)",
"### Data Splits\n\nAll examples are in the train split, there is no validation split.",
"## Additional Information",
"### License\n\nThis dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:\n\n* Use it for any purpose, including commercial projects.\n* Modify it however you like.\n* Distribute it without asking permission.\n\nNo attribution is required, but it's always appreciated!\n\nCC0 license: URL\n\nTo learn more about CC0, visit the Creative Commons website: URL",
"### Dataset Curators\n\n- nyuuzyou"
] |
4e6b2979a1a3138c6b327c3d16b33840b6bb3d5f |
# Dataset Card for Evaluation run of PotatoOff/HamSter-0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [PotatoOff/HamSter-0.2](https://huggingface.co/PotatoOff/HamSter-0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PotatoOff__HamSter-0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T20:12:25.047225](https://huggingface.co/datasets/open-llm-leaderboard/details_PotatoOff__HamSter-0.2/blob/main/results_2024-01-16T20-12-25.047225.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4993855534029302,
"acc_stderr": 0.034244491357846386,
"acc_norm": 0.5077537035345174,
"acc_norm_stderr": 0.03517731824473503,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135025,
"mc2": 0.49629739509694737,
"mc2_stderr": 0.015731600227202613
},
"harness|arc:challenge|25": {
"acc": 0.4786689419795222,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5008532423208191,
"acc_norm_stderr": 0.014611369529813272
},
"harness|hellaswag|10": {
"acc": 0.5668193586934873,
"acc_stderr": 0.0049450236570322765,
"acc_norm": 0.7365066719776937,
"acc_norm_stderr": 0.004396273173717463
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.045378153549393924,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.045378153549393924
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028424,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6,
"acc_stderr": 0.027869320571664625,
"acc_norm": 0.6,
"acc_norm_stderr": 0.027869320571664625
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916646,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49230769230769234,
"acc_stderr": 0.025348006031534778,
"acc_norm": 0.49230769230769234,
"acc_norm_stderr": 0.025348006031534778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.653211009174312,
"acc_stderr": 0.020406097104093024,
"acc_norm": 0.653211009174312,
"acc_norm_stderr": 0.020406097104093024
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.03324708911809118,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.03324708911809118
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906274,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906274
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138937,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138937
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6213592233009708,
"acc_stderr": 0.04802694698258973,
"acc_norm": 0.6213592233009708,
"acc_norm_stderr": 0.04802694698258973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009157,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009157
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6781609195402298,
"acc_stderr": 0.016706381415057904,
"acc_norm": 0.6781609195402298,
"acc_norm_stderr": 0.016706381415057904
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2837988826815642,
"acc_stderr": 0.015078358970751765,
"acc_norm": 0.2837988826815642,
"acc_norm_stderr": 0.015078358970751765
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5065359477124183,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.5065359477124183,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.028320325830105908,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.028320325830105908
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668773,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668773
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115886,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3709256844850065,
"acc_stderr": 0.01233739168453031,
"acc_norm": 0.3709256844850065,
"acc_norm_stderr": 0.01233739168453031
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714874,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714874
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0201965949335412,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0201965949335412
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135025,
"mc2": 0.49629739509694737,
"mc2_stderr": 0.015731600227202613
},
"harness|winogrande|5": {
"acc": 0.696921862667719,
"acc_stderr": 0.012916727462634463
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
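If you prefer working with the raw JSON file linked above, the following hedged sketch (assuming `huggingface_hub` is installed; the file layout may nest the per-task scores under a `results` key, hence the fallback) downloads and inspects it:
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_PotatoOff__HamSter-0.2",
    filename="results_2024-01-16T20-12-25.047225.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)
scores = raw.get("results", raw)  # fall back to the top level if not nested
print(scores["all"])  # aggregate acc / acc_norm / mc1 / mc2
```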
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_PotatoOff__HamSter-0.2 | [
"region:us"
] | 2024-01-16T20:14:40+00:00 | {"pretty_name": "Evaluation run of PotatoOff/HamSter-0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [PotatoOff/HamSter-0.2](https://huggingface.co/PotatoOff/HamSter-0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PotatoOff__HamSter-0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T20:12:25.047225](https://huggingface.co/datasets/open-llm-leaderboard/details_PotatoOff__HamSter-0.2/blob/main/results_2024-01-16T20-12-25.047225.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4993855534029302,\n \"acc_stderr\": 0.034244491357846386,\n \"acc_norm\": 0.5077537035345174,\n \"acc_norm_stderr\": 0.03517731824473503,\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.016419874731135025,\n \"mc2\": 0.49629739509694737,\n \"mc2_stderr\": 0.015731600227202613\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4786689419795222,\n \"acc_stderr\": 0.014598087973127106,\n \"acc_norm\": 0.5008532423208191,\n \"acc_norm_stderr\": 0.014611369529813272\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5668193586934873,\n \"acc_stderr\": 0.0049450236570322765,\n \"acc_norm\": 0.7365066719776937,\n \"acc_norm_stderr\": 0.004396273173717463\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n 
\"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.045378153549393924,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.045378153549393924\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028424,\n \"acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028424\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.027869320571664625,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.027869320571664625\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.03318477333845331,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.03318477333845331\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916646,\n \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916646\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.49230769230769234,\n 
\"acc_stderr\": 0.025348006031534778,\n \"acc_norm\": 0.49230769230769234,\n \"acc_norm_stderr\": 0.025348006031534778\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115007,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115007\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.653211009174312,\n \"acc_stderr\": 0.020406097104093024,\n \"acc_norm\": 0.653211009174312,\n \"acc_norm_stderr\": 0.020406097104093024\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.03324708911809118,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.03324708911809118\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04643454608906274,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04643454608906274\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138937,\n \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138937\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6213592233009708,\n \"acc_stderr\": 0.04802694698258973,\n \"acc_norm\": 0.6213592233009708,\n \"acc_norm_stderr\": 0.04802694698258973\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n \"acc_stderr\": 0.026853450377009157,\n \"acc_norm\": 0.7863247863247863,\n \"acc_norm_stderr\": 0.026853450377009157\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6781609195402298,\n \"acc_stderr\": 0.016706381415057904,\n \"acc_norm\": 
0.6781609195402298,\n \"acc_norm_stderr\": 0.016706381415057904\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.02658923114217426,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.02658923114217426\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2837988826815642,\n \"acc_stderr\": 0.015078358970751765,\n \"acc_norm\": 0.2837988826815642,\n \"acc_norm_stderr\": 0.015078358970751765\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5065359477124183,\n \"acc_stderr\": 0.028627470550556054,\n \"acc_norm\": 0.5065359477124183,\n \"acc_norm_stderr\": 0.028627470550556054\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n \"acc_stderr\": 0.028320325830105908,\n \"acc_norm\": 0.5369774919614148,\n \"acc_norm_stderr\": 0.028320325830105908\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668773,\n \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668773\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115886,\n \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115886\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3709256844850065,\n \"acc_stderr\": 0.01233739168453031,\n \"acc_norm\": 0.3709256844850065,\n \"acc_norm_stderr\": 0.01233739168453031\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714874,\n \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714874\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0201965949335412,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0201965949335412\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.016419874731135025,\n \"mc2\": 0.49629739509694737,\n \"mc2_stderr\": 0.015731600227202613\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.012916727462634463\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/PotatoOff/HamSter-0.2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|arc:challenge|25_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|gsm8k|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hellaswag|10_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-12-25.047225.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-12-25.047225.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-12-25.047225.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T20-12-25.047225.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-12-25.047225.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-12-25.047225.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["**/details_harness|winogrande|5_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T20-12-25.047225.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T20_12_25.047225", "path": ["results_2024-01-16T20-12-25.047225.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T20-12-25.047225.parquet"]}]}]} | 2024-01-16T20:15:03+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of PotatoOff/HamSter-0.2
Dataset automatically created during the evaluation run of model PotatoOff/HamSter-0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-16T20:12:25.047225 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of PotatoOff/HamSter-0.2\n\n\n\nDataset automatically created during the evaluation run of model PotatoOff/HamSter-0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T20:12:25.047225(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of PotatoOff/HamSter-0.2\n\n\n\nDataset automatically created during the evaluation run of model PotatoOff/HamSter-0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T20:12:25.047225(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
56ac81124021b00dc242614327c9246282024033 |
# Dataset Card for Evaluation run of flemmingmiguel/DareBeagle-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/DareBeagle-7B](https://huggingface.co/flemmingmiguel/DareBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__DareBeagle-7B",
"harness_winogrande_5",
split="train")
```
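
If you want the aggregated metrics rather than the per-sample details of a single task, you can instead load the "results" configuration described above. A minimal sketch, assuming the same `datasets` API and the "latest" split exposed by this repo (the exact column layout of the results split may vary between runs):

```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run (sketch).
results = load_dataset("open-llm-leaderboard/details_flemmingmiguel__DareBeagle-7B",
	"results",
	split="latest")

print(results[0])  # one row per run, containing the aggregated scores
```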
## Latest results
These are the [latest results from run 2024-01-16T20:16:03.594958](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__DareBeagle-7B/blob/main/results_2024-01-16T20-16-03.594958.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6572828321826875,
"acc_stderr": 0.03195518858047768,
"acc_norm": 0.6570740734999085,
"acc_norm_stderr": 0.032615184025325115,
"mc1": 0.5532435740514076,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.6829983144235686,
"mc2_stderr": 0.014999747071250642
},
"harness|arc:challenge|25": {
"acc": 0.6877133105802048,
"acc_stderr": 0.013542598541688065,
"acc_norm": 0.7158703071672355,
"acc_norm_stderr": 0.013179442447653886
},
"harness|hellaswag|10": {
"acc": 0.7044413463453495,
"acc_stderr": 0.0045536094057472215,
"acc_norm": 0.8798048197570205,
"acc_norm_stderr": 0.0032452503945652944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5532435740514076,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.6829983144235686,
"mc2_stderr": 0.014999747071250642
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613983
},
"harness|gsm8k|5": {
"acc": 0.711144806671721,
"acc_stderr": 0.012484219800126666
}
}
```
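
The per-task numbers above can also be aggregated by hand. A minimal sketch (the two entries below are copied verbatim from the results above; extend the dict with the remaining tasks to reproduce the full averages):

```python
# Per-task accuracies, copied from the results above (subset shown for brevity).
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7105263157894737},
}

# Mean accuracy over the included MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(sum(mmlu) / len(mmlu))
```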
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_flemmingmiguel__DareBeagle-7B | [
"region:us"
] | 2024-01-16T20:18:21+00:00 | {"pretty_name": "Evaluation run of flemmingmiguel/DareBeagle-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [flemmingmiguel/DareBeagle-7B](https://huggingface.co/flemmingmiguel/DareBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__DareBeagle-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T20:16:03.594958](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__DareBeagle-7B/blob/main/results_2024-01-16T20-16-03.594958.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6572828321826875,\n \"acc_stderr\": 0.03195518858047768,\n \"acc_norm\": 0.6570740734999085,\n \"acc_norm_stderr\": 0.032615184025325115,\n \"mc1\": 0.5532435740514076,\n \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.6829983144235686,\n \"mc2_stderr\": 0.014999747071250642\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6877133105802048,\n \"acc_stderr\": 0.013542598541688065,\n \"acc_norm\": 0.7158703071672355,\n \"acc_norm_stderr\": 0.013179442447653886\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7044413463453495,\n \"acc_stderr\": 0.0045536094057472215,\n \"acc_norm\": 0.8798048197570205,\n \"acc_norm_stderr\": 0.0032452503945652944\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5532435740514076,\n \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.6829983144235686,\n \"mc2_stderr\": 0.014999747071250642\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613983\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.711144806671721,\n \"acc_stderr\": 0.012484219800126666\n }\n}\n```", "repo_url": 
"https://huggingface.co/flemmingmiguel/DareBeagle-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|arc:challenge|25_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|gsm8k|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hellaswag|10_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-16-03.594958.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-16-03.594958.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-16-03.594958.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T20-16-03.594958.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-16-03.594958.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-16-03.594958.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["**/details_harness|winogrande|5_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T20-16-03.594958.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T20_16_03.594958", "path": ["results_2024-01-16T20-16-03.594958.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T20-16-03.594958.parquet"]}]}]} | 2024-01-16T20:18:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of flemmingmiguel/DareBeagle-7B
Dataset automatically created during the evaluation run of model flemmingmiguel/DareBeagle-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
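A minimal loading sketch is given below. The repository name assumes the leaderboard's usual `details_<org>__<model>` naming for this model, and the config/split follow the pattern used by these evaluation datasets; adjust them if the actual repository differs:

```python
from datasets import load_dataset

# Repository name assumed from the leaderboard's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__DareBeagle-7B",
                    "harness_winogrande_5",
                    split="train")
```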
## Latest results
These are the latest results from run 2024-01-16T20:16:03.594958 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of flemmingmiguel/DareBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/DareBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T20:16:03.594958(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of flemmingmiguel/DareBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/DareBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T20:16:03.594958(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7a8df17317114e10ef25390654ec720d3484e2c3 | Acoustic Waveform Airway and Respiratory Examination (AWARE/PTEase) Dataset
===================================================================================================
Guidelines
===================================================================================================
AWARE/PTEase is a smartphone-based sensing system that examines the human airway's internal physiological conditions, developed by the Intelligent Systems Laboratory at the University of Pittsburgh. AWARE/PTEase probes the airway with acoustic pulses through the mouth and collects the airway's reflections for analysis. Please refer to [our paper](https://doi.org/10.1145/3581791.3596854) and [github repo](https://github.com/pittisl/PTEase) for more details.
This dataset includes the raw and pre-processed acoustic data of airway measurements collected from 382 human subjects recruited at the Children’s Hospital of Pittsburgh. Demographics of the recruited human subjects are as follows:
| Category | Characteristics | Number |
|--------------------|----------------------|----------------|
| Demographics | Age (years) | 22.55 ± 15.63 |
| | Adults (%) | 166 (43.46) |
| | Female (%) | 205 (53.66) |
| | Caucasian (%) | 287 (75.13) |
| | African-American (%) | 93 (24.35) |
| Body Conditions | Height (cm) | 159.76 ± 15.67 |
| | Weight (kg) | 65.79 ± 27.28 |
| Disease Conditions | Healthy (%) | 115 (30.10) |
| | Asthma (%) | 190 (49.74) |
| | Cystic Fibrosis (%) | 64 (16.75) |
| | COPD (%) | 8 (2.09) |
| | Others (%) | 5 (1.31) |
The contents of this dataset include three parts (a short Python access sketch follows this list):
- Raw WAV files
- "aware_raw.zip"
- Subfolders are named by the subjects' ID (from 1 to 382). In each subfolder, there are three "cali_X.wav" files and multiple "test_X.wav" files
- "cali_1.wav" : Calibration step 1 - long tube calibration - consists of 12 pairs of received pulses. Each pair contains one wide-band (0-16Hz) pulse and one narrow-band (0-200Hz) pulse. The narrow-band pulses are deperacated and only the wide-band pulses are used. Transmitting period for each pair is 0.4 sec.
- "cali_2.wav" : Calibration step 2 - standard tube close-end calibration - is the same as above except the calibrations are conducted in a different setting.
- "cali_3.wav" : Calibration step 3 - standard tube open-end calibration - is the same as above except the calibrations are conducted in a different setting.
- "test_X.wav" : Test number X - the subject's X-th attempt of conducting the test - follows the protocol of "Nasal breathing -> Inhale -> Exhale -> Inhale -> Exhale -> Inhale -> Exhale". Each segment contains 25 pulses collected at an interval of 0.2 sec. NOTE: Some of the samples have 50 pulses for the "Nasal breathing" phase.
- Segmented and aligned acoustic pulses
- "aware_segmented.pkl"
- Python pandas.DataFrame saved as a PICKLE file. Same contents as "aware_segmented.csv", but without flattening.
- To load the data, use the following commands:
``` python
import pandas as pd
df = pd.read_pickle("aware_segmented.pkl")
```
- Column 0-45 : Basic subject information, including demographics, clinical measurements and labels.
- Column 0 : Subject ID
- Column 1-9 : Demographics
- Column 10 : Disease label
- Column 11-21 : Impulse Oscillometry Measurements
- Column 22-30 : Spirometry measurements
- Column 31-44 : Spirometry measurements in post-bronchodilator test
- Column 45 : Test number
- Column 46-55 : Acoustic signals (16-bit PCM, sample rate = 48000) arranged by different phases. Each calibration phase contains 12 measurements and each testing phase contains 25 measurements. Each measurement lasts for 0.1 sec (4800 sample points).
- Column 46 : Phase "Calibration #1". Each element is a [12,4800] NumPy ndarray
- Column 47 : Phase "Calibration #2". Each element is a [12,4800] NumPy ndarray
- Column 48 : Phase "Calibration #3". Each element is a [12,4800] NumPy ndarray
- Column 49 : Phase "Nasal breathing". Each element is a [25,4800] NumPy ndarray
- Column 50 : Phase "Inhale #1". Each element is a [25,4800] NumPy ndarray
- Column 51 : Phase "Inhale #2". Each element is a [25,4800] NumPy ndarray
- Column 52 : Phase "Inhale #3". Each element is a [25,4800] NumPy ndarray
- Column 53 : Phase "Exhale #1". Each element is a [25,4800] NumPy ndarray
- Column 54 : Phase "Exhale #2". Each element is a [25,4800] NumPy ndarray
- Column 55 : Phase "Exhale #3". Each element is a [25,4800] NumPy ndarray
- "aware_segmented.csv" and "headers.txt"
- NOTE: Directly loading this CSV file is not recommended. It could be time-consuming and buggy. Try "aware_segmented.pkl" first.
- NOTE: "aware_segmented.csv" has no headers. See column definition below.
- Column 0-45 : Basic subject information, including demographics, clinical measurements and labels.
- Column 0 : Subject ID
- Column 1-9 : Demographics
- Column 10 : Disease label
- Column 11-21 : Impulse Oscillometry Measurements
- Column 22-30 : Spirometry measurements
- Column 31-44 : Spirometry measurements in post-bronchodilator test
- Column 45 : Test number
- Column 46-1012845 : Acoustic signals (16-bit PCM, sample rate = 48000) arranged by different phases. Each calibration phase contains 12 measurements and each testing phase contains 25 measurements. Each measurement lasts for 0.1 sec (4800 sample points).
- Column 46-57645 : Phase "Calibration #1". Flattened from [12,4800] by row-major order.
- Column 57646-115245 : Phase "Calibration #2". Flattened from [12,4800] by row-major order.
- Column 115246-172845 : Phase "Calibration #3". Flattened from [12,4800] by row-major order.
- Column 172846-292845 : Phase "Nasal breathing". Flattened from [25,4800] by row-major order.
- Column 292846-412845 : Phase "Inhale #1". Flattened from [25,4800] by row-major order.
- Column 412846-532845 : Phase "Inhale #2". Flattened from [25,4800] by row-major order.
- Column 532846-652845 : Phase "Inhale #3". Flattened from [25,4800] by row-major order.
- Column 652846-772845 : Phase "Exhale #1". Flattened from [25,4800] by row-major order.
- Column 772846-892845 : Phase "Exhale #2". Flattened from [25,4800] by row-major order.
- Column 892846-1012845 : Phase "Exhale #3". Flattened from [25,4800] by row-major order.
- Processed airway cross-sectional area (CSA) curves inferred from the acoustic signals
- "aware_csa.csv"
- NOTE: Not all raw samples are used for airway CSA reconstructions. Low-quality data is discarded, thus the sample size of this part is smaller than above
- Column 0-45 : Basic subject information. Same as "aware_segmented.csv".
- Column 46-129: Reconstructed airway cross-sectional area at different depths from mouth.
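
As a quick illustration of the layout described above, the sketch below reads one raw recording, pulls one phase out of the segmented pickle, and loads a reconstructed CSA curve. File names and column positions follow the listing above; the choice of `scipy` for WAV reading, the extraction path for "aware_raw.zip", and the absence of a header row in "aware_csa.csv" are assumptions, and real use of the raw recordings would detect the exact pulse onsets rather than rely on fixed offsets.

```python
import numpy as np
import pandas as pd
from scipy.io import wavfile  # assumption: any WAV reader works

# Raw WAV files: read one test recording of subject 1
# (path assumes aware_raw.zip was extracted into a folder named "aware_raw")
fs, audio = wavfile.read("aware_raw/1/test_1.wav")  # expected fs = 48000 Hz
print(fs, audio.shape)

# Segmented pulses: pull subject 1's "Exhale #1" phase (column 53 in the layout above)
df = pd.read_pickle("aware_segmented.pkl")
row = df[df.iloc[:, 0] == 1].iloc[0]   # column 0 is the subject ID
exhale1 = row.iloc[53]                 # [25, 4800] NumPy ndarray, 25 pulses of 0.1 s each
mean_pulse = exhale1.mean(axis=0)      # average waveform over the 25 pulses

# CSA curves: reconstructed cross-sectional area vs. depth from mouth (columns 46-129)
csa = pd.read_csv("aware_csa.csv", header=None)  # header row assumed absent
curve = csa.iloc[0, 46:130].to_numpy(dtype=float)
```
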
For more information about this dataset please contact: [email protected]
Citation
===================================================================================================
Use of this dataset in publications must be acknowledged by referencing the following publication:
```
@inproceedings{yin2023ptease,
title={PTEase: Objective Airway Examination for Pulmonary Telemedicine using Commodity Smartphones},
author={Yin, Xiangyu and Huang, Kai and Forno, Erick and Chen, Wei and Huang, Heng and Gao, Wei},
booktitle={Proceedings of the 21st Annual International Conference on Mobile Systems, Applications and Services},
pages={110--123},
year={2023},
url = {https://doi.org/10.1145/3581791.3596854},
doi = {10.1145/3581791.3596854}
}
```
Other Related Publications:
===================================================================================================
```
@inproceedings{yin2022out,
title={Out-Clinic Pulmonary Disease Evaluation via Acoustic Sensing and Multi-Task Learning on Commodity Smartphones},
author={Yin, Xiangyu and Huang, Kai and Forno, Erick and Chen, Wei and Huang, Heng and Gao, Wei},
booktitle={Proceedings of the 20th ACM Conference on Embedded Networked Sensor Systems},
pages={1182--1188},
year={2022},
url = {https://doi.org/10.1145/3560905.3568437},
doi = {10.1145/3560905.3568437}
}
```
License
===================================================================================================
[![CC BY-NC-SA 4.0][cc-by-nc-sa-shield]][cc-by-nc-sa]
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg
| ericyxy98/AWARE | [
"task_categories:audio-classification",
"license:cc-by-nc-sa-4.0",
"medical",
"region:us"
] | 2024-01-16T20:21:38+00:00 | {"license": "cc-by-nc-sa-4.0", "task_categories": ["audio-classification"], "tags": ["medical"], "configs": [{"config_name": "default", "data_files": [{"split": "csa", "path": "aware_csa.csv"}]}]} | 2024-01-18T20:16:50+00:00 | [] | [] | TAGS
#task_categories-audio-classification #license-cc-by-nc-sa-4.0 #medical #region-us
| Acoustic Waveform Airway and Respiratory Examination (AWARE/PTEase) Dataset
===========================================================================
Guidelines
==========
AWARE/PTEase is a smartphone-based sensing system that examines the human airway's internal physiological conditions, developed by the Intelligent Systems Laboratory at the University of Pittsburgh. AWARE/PTEase probes the airway with acoustic pulses through the mouth and collects the airway's reflections for analysis. Please refer to our paper and github repo for more details.
This dataset includes the raw and pre-processed acoustic data of airway measurements collected from 382 human subjects recruited at the Children’s Hospital of Pittsburgh. Demographics of the recruited human subjects are as follows:
Category: Demographics, Characteristics: Age (years), Number: 22.55 ± 15.63
Category: , Characteristics: Adults (%), Number: 166 (43.46)
Category: , Characteristics: Female (%), Number: 205 (53.66)
Category: , Characteristics: Caucasian (%), Number: 287 (75.13)
Category: , Characteristics: African-American (%), Number: 93 (24.35)
Category: Body Conditions, Characteristics: Height (cm), Number: 159.76 ± 15.67
Category: , Characteristics: Weight (kg), Number: 65.79 ± 27.28
Category: Disease Conditions, Characteristics: Healthy (%), Number: 115 (30.10)
Category: , Characteristics: Asthma (%), Number: 190 (49.74)
Category: , Characteristics: Cystic Fibrosis (%), Number: 64 (16.75)
Category: , Characteristics: COPD (%), Number: 8 (2.09)
Category: , Characteristics: Others (%), Number: 5 (1.31)
The contents of this dataset include three parts:
* Raw WAV files
+ "aware\_raw.zip"
- Subfolders are named by the subjects' ID (from 1 to 382). In each subfolder, there are three "cali\_X.wav" files and multiple "test\_X.wav" files
* "cali\_1.wav" : Calibration step 1 - long tube calibration - consists of 12 pairs of received pulses. Each pair contains one wide-band (0-16Hz) pulse and one narrow-band (0-200Hz) pulse. The narrow-band pulses are deperacated and only the wide-band pulses are used. Transmitting period for each pair is 0.4 sec.
* "cali\_2.wav" : Calibration step 2 - standard tube close-end calibration - is the same as above except the calibrations are conducted in a different setting.
* "cali\_3.wav" : Calibration step 3 - standard tube open-end calibration - is the same as above except the calibrations are conducted in a different setting.
* "test\_X.wav" : Test number X - the subject's X-th attempt of conducting the test - follows the protocol of "Nasal breathing -> Inhale -> Exhale -> Inhale -> Exhale -> Inhale -> Exhale". Each segment contains 25 pulses collected at an interval of 0.2 sec. NOTE: Some of the samples have 50 pulses for the "Nasal breathing" phase.
* Segmented and aligned acoustic pulses
+ "aware\_segmented.pkl"
- Python pandas.DataFrame saved as a PICKLE file. Same contents as "aware\_segmented.csv", but without flattening.
- To load the data, use the following commands:
- Column 0-45 : Basic subject information, including demographics, clinical measurements and labels.
* Column 0 : Subject ID
* Column 1-9 : Demographics
* Column 10 : Disease label
* Column 11-21 : Impulse Oscillometry Measurements
* Column 22-30 : Spirometry measurements
* Column 31-44 : Spirometry measurements in post-bronchodilator test
* Column 45 : Test number
- Column 46-55 : Acoustic signals (16-bit PCM, sample rate = 48000) arranged by different phases. Each calibration phase contains 12 measurements and each testing phase contains 25 measurements. Each measurement lasts for 0.1 sec (4800 sample points).
* Column 46 : Phase "Calibration #1". Each element is a [12,4800] NumPy ndarray
* Column 47 : Phase "Calibration #2". Each element is a [12,4800] NumPy ndarray
* Column 48 : Phase "Calibration #3". Each element is a [12,4800] NumPy ndarray
* Column 49 : Phase "Nasal breathing". Each element is a [25,4800] NumPy ndarray
* Column 50 : Phase "Inhale #1". Each element is a [25,4800] NumPy ndarray
* Column 51 : Phase "Inhale #2". Each element is a [25,4800] NumPy ndarray
* Column 52 : Phase "Inhale #3". Each element is a [25,4800] NumPy ndarray
* Column 53 : Phase "Exhale #1". Each element is a [25,4800] NumPy ndarray
* Column 54 : Phase "Exhale #2". Each element is a [25,4800] NumPy ndarray
* Column 55 : Phase "Exhale #3". Each element is a [25,4800] NumPy ndarray
+ "aware\_segmented.csv" and "URL"
- NOTE: Directly loading this CSV file is not recommended. It could be time-consuming and buggy. Try "aware\_segmented.pkl" first.
- NOTE: "aware\_segmented.csv" has no headers. See column definition below.
- Column 0-45 : Basic subject information, including demographics, clinical measurements and labels.
* Column 0 : Subject ID
* Column 1-9 : Demographics
* Column 10 : Disease label
* Column 11-21 : Impulse Oscillometry Measurements
* Column 22-30 : Spirometry measurements
* Column 31-44 : Spirometry measurements in post-bronchodilator test
* Column 45 : Test number
- Column 46-1012845 : Acoustic signals (16-bit PCM, sample rate = 48000) arranged by different phases. Each calibration phase contains 12 measurements and each testing phase contains 25 measurements. Each measurement lasts for 0.1 sec (4800 sample points).
* Column 46-57645 : Phase "Calibration #1". Flattened from [12,4800] by row-major order.
* Column 57646-115245 : Phase "Calibration #2". Flattened from [12,4800] by row-major order.
* Column 115246-172845 : Phase "Calibration #3". Flattened from [12,4800] by row-major order.
* Column 172846-292845 : Phase "Nasal breathing". Flattened from [25,4800] by row-major order.
* Column 292846-412845 : Phase "Inhale #1". Flattened from [25,4800] by row-major order.
* Column 412846-532845 : Phase "Inhale #2". Flattened from [25,4800] by row-major order.
* Column 532846-652845 : Phase "Inhale #3". Flattened from [25,4800] by row-major order.
* Column 652846-772845 : Phase "Exhale #1". Flattened from [25,4800] by row-major order.
* Column 772846-892845 : Phase "Exhale #2". Flattened from [25,4800] by row-major order.
* Column 892846-1012845 : Phase "Exhale #3". Flattened from [25,4800] by row-major order.
* Processed airway cross-sectional area (CSA) curves inferred from the acoustic signals
+ "aware\_csa.csv"
- NOTE: Not all raw samples are used for airway CSA reconstructions. Low-quality data is discarded, thus the sample size of this part is smaller than above
- Column 0-45 : Basic subject information. Same as "aware\_segmented.csv".
- Column 46-129: Reconstructed airway cross-sectional area at different depths from mouth.
For more information about this dataset please contact: URL@URL
Citation
========
Use of this dataset in publications must be acknowledged by referencing the following publication:
Other Related Publications:
===========================
License
=======
[](URL)
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](URL).
[](URL)
| [] | [
"TAGS\n#task_categories-audio-classification #license-cc-by-nc-sa-4.0 #medical #region-us \n"
] |
39bbc889e6ed6a563c82f880277bf300a827cae1 |
# Dataset Card for Evaluation run of NeuralNovel/Gecko-7B-v0.1-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeuralNovel/Gecko-7B-v0.1-DPO](https://huggingface.co/NeuralNovel/Gecko-7B-v0.1-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1-DPO",
"harness_winogrande_5",
split="train")
```
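
As a quick sanity check after loading, the split can be inspected with the standard `datasets` API (the exact columns stored in the detail files are not listed on this card, so the output is simply whatever the run logged):

```python
# Inspect the loaded split: number of logged examples and their fields
print(len(data), data.column_names)
print(data[0])  # first logged example of the winogrande run
```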
## Latest results
These are the [latest results from run 2024-01-16T21:04:36.728046](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1-DPO/blob/main/results_2024-01-16T21-04-36.728046.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6033992573552084,
"acc_stderr": 0.033215570468910556,
"acc_norm": 0.6072820051491797,
"acc_norm_stderr": 0.033890375696348236,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.5742349007721483,
"mc2_stderr": 0.015183840204619218
},
"harness|arc:challenge|25": {
"acc": 0.5187713310580204,
"acc_stderr": 0.014601090150633964,
"acc_norm": 0.5674061433447098,
"acc_norm_stderr": 0.014478005694182524
},
"harness|hellaswag|10": {
"acc": 0.6263692491535551,
"acc_stderr": 0.004827786289074842,
"acc_norm": 0.8238398725353515,
"acc_norm_stderr": 0.003801777779809583
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956914,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956914
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02459497512892094,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02459497512892094
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.02692344605930284,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.02692344605930284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113114,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.03128217706368461,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.03128217706368461
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.038969819642573754,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.038969819642573754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.789272030651341,
"acc_stderr": 0.01458381246586254,
"acc_norm": 0.789272030651341,
"acc_norm_stderr": 0.01458381246586254
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647897,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647897
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34301675977653634,
"acc_stderr": 0.015876912673057738,
"acc_norm": 0.34301675977653634,
"acc_norm_stderr": 0.015876912673057738
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001865,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001865
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4217731421121252,
"acc_stderr": 0.012612974369390977,
"acc_norm": 0.4217731421121252,
"acc_norm_stderr": 0.012612974369390977
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6094771241830066,
"acc_stderr": 0.0197370089980946,
"acc_norm": 0.6094771241830066,
"acc_norm_stderr": 0.0197370089980946
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555401,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555401
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.5742349007721483,
"mc2_stderr": 0.015183840204619218
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698341
},
"harness|gsm8k|5": {
"acc": 0.45034116755117515,
"acc_stderr": 0.013704390498582816
}
}
```
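
As an illustration of how the per-task scores relate to the aggregate numbers, the sketch below averages the `acc_norm` of the `hendrycksTest` (MMLU) subtasks from a dictionary shaped like the JSON above; `scores` is assumed to be that parsed dictionary, however it was obtained:

```python
# `scores` is assumed to be a dict shaped like the JSON snippet above.
mmlu = [k for k in scores if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(scores[k]["acc_norm"] for k in mmlu) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {mmlu_avg:.4f}")
```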
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1-DPO | [
"region:us"
] | 2024-01-16T21:06:51+00:00 | {"pretty_name": "Evaluation run of NeuralNovel/Gecko-7B-v0.1-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeuralNovel/Gecko-7B-v0.1-DPO](https://huggingface.co/NeuralNovel/Gecko-7B-v0.1-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T21:04:36.728046](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1-DPO/blob/main/results_2024-01-16T21-04-36.728046.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6033992573552084,\n \"acc_stderr\": 0.033215570468910556,\n \"acc_norm\": 0.6072820051491797,\n \"acc_norm_stderr\": 0.033890375696348236,\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5742349007721483,\n \"mc2_stderr\": 0.015183840204619218\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5187713310580204,\n \"acc_stderr\": 0.014601090150633964,\n \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182524\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6263692491535551,\n \"acc_stderr\": 0.004827786289074842,\n \"acc_norm\": 0.8238398725353515,\n \"acc_norm_stderr\": 0.003801777779809583\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797611,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797611\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.6319444444444444,\n \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 
0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956914,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956914\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02459497512892094,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02459497512892094\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n \"acc_stderr\": 0.02692344605930284,\n \"acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.02692344605930284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n \"acc_stderr\": 
0.01458381246586254,\n \"acc_norm\": 0.789272030651341,\n \"acc_norm_stderr\": 0.01458381246586254\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647897,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647897\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34301675977653634,\n \"acc_stderr\": 0.015876912673057738,\n \"acc_norm\": 0.34301675977653634,\n \"acc_norm_stderr\": 0.015876912673057738\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001865,\n \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001865\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4217731421121252,\n \"acc_stderr\": 0.012612974369390977,\n \"acc_norm\": 0.4217731421121252,\n \"acc_norm_stderr\": 0.012612974369390977\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6094771241830066,\n \"acc_stderr\": 0.0197370089980946,\n \"acc_norm\": 0.6094771241830066,\n \"acc_norm_stderr\": 0.0197370089980946\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5742349007721483,\n \"mc2_stderr\": 0.015183840204619218\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698341\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45034116755117515,\n \"acc_stderr\": 0.013704390498582816\n }\n}\n```", "repo_url": 
"https://huggingface.co/NeuralNovel/Gecko-7B-v0.1-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|arc:challenge|25_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|gsm8k|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hellaswag|10_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-04-36.728046.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-04-36.728046.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-04-36.728046.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T21-04-36.728046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-04-36.728046.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-04-36.728046.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["**/details_harness|winogrande|5_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T21-04-36.728046.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T21_04_36.728046", "path": ["results_2024-01-16T21-04-36.728046.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T21-04-36.728046.parquet"]}]}]} | 2024-01-16T21:07:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NeuralNovel/Gecko-7B-v0.1-DPO
Dataset automatically created during the evaluation run of model NeuralNovel/Gecko-7B-v0.1-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
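(A minimal sketch; the repository, config, and split names below follow the leaderboard's usual `details_<org>__<model>` naming pattern and are assumptions rather than values taken verbatim from this card.)

```python
from datasets import load_dataset

# Repository name assumed from the standard leaderboard naming pattern.
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1-DPO",
	"harness_winogrande_5",
	split="train")
```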
## Latest results
These are the latest results from run 2024-01-16T21:04:36.728046 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NeuralNovel/Gecko-7B-v0.1-DPO\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Gecko-7B-v0.1-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T21:04:36.728046(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NeuralNovel/Gecko-7B-v0.1-DPO\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Gecko-7B-v0.1-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T21:04:36.728046(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
28bc4ef26c95a93add35fc6aa55810d4def0deac |
# Dataset Card for Evaluation run of rombodawg/Leaderboard-killer-MoE_4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rombodawg/Leaderboard-killer-MoE_4x7b](https://huggingface.co/rombodawg/Leaderboard-killer-MoE_4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rombodawg__Leaderboard-killer-MoE_4x7b",
"harness_winogrande_5",
split="train")
```
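If you only need the aggregated scores rather than the per-sample details, the "results" configuration can be loaded the same way (a sketch; the "results" config name and "latest" split follow the pattern these leaderboard detail datasets normally use and are assumptions, not values quoted from this card):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_rombodawg__Leaderboard-killer-MoE_4x7b",
	"results",
	split="latest")
```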
## Latest results
These are the [latest results from run 2024-01-16T21:07:48.403934](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Leaderboard-killer-MoE_4x7b/blob/main/results_2024-01-16T21-07-48.403934.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6491805510695005,
"acc_stderr": 0.03201485523454187,
"acc_norm": 0.6516714171403527,
"acc_norm_stderr": 0.032647659075440504,
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.5075038654523923,
"mc2_stderr": 0.015124303689968813
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.014301752223279531,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.6297550288787094,
"acc_stderr": 0.0048188335213403535,
"acc_norm": 0.8196574387572196,
"acc_norm_stderr": 0.003836867708701993
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101737,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101737
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601684,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601684
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.01591955782997606,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.01591955782997606
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064076,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064076
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.01596103667523096,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.01596103667523096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218894,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218894
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174934,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174934
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.5075038654523923,
"mc2_stderr": 0.015124303689968813
},
"harness|winogrande|5": {
"acc": 0.7537490134175217,
"acc_stderr": 0.012108365307437528
},
"harness|gsm8k|5": {
"acc": 0.621683093252464,
"acc_stderr": 0.013358407831777105
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_rombodawg__Leaderboard-killer-MoE_4x7b | [
"region:us"
] | 2024-01-16T21:10:05+00:00 | {"pretty_name": "Evaluation run of rombodawg/Leaderboard-killer-MoE_4x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [rombodawg/Leaderboard-killer-MoE_4x7b](https://huggingface.co/rombodawg/Leaderboard-killer-MoE_4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rombodawg__Leaderboard-killer-MoE_4x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T21:07:48.403934](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Leaderboard-killer-MoE_4x7b/blob/main/results_2024-01-16T21-07-48.403934.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6491805510695005,\n \"acc_stderr\": 0.03201485523454187,\n \"acc_norm\": 0.6516714171403527,\n \"acc_norm_stderr\": 0.032647659075440504,\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.5075038654523923,\n \"mc2_stderr\": 0.015124303689968813\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279531,\n \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068283\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6297550288787094,\n \"acc_stderr\": 0.0048188335213403535,\n \"acc_norm\": 0.8196574387572196,\n \"acc_norm_stderr\": 0.003836867708701993\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101737,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997606,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997606\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064076,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064076\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n \"acc_stderr\": 0.01596103667523096,\n \"acc_norm\": 0.35083798882681566,\n \"acc_norm_stderr\": 0.01596103667523096\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218894,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218894\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162666,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162666\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174934,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174934\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.5075038654523923,\n \"mc2_stderr\": 0.015124303689968813\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437528\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.621683093252464,\n \"acc_stderr\": 0.013358407831777105\n }\n}\n```", "repo_url": 
"https://huggingface.co/rombodawg/Leaderboard-killer-MoE_4x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|arc:challenge|25_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|gsm8k|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hellaswag|10_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-07-48.403934.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-07-48.403934.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-07-48.403934.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T21-07-48.403934.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-07-48.403934.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T21_07_48.403934", "path": ["**/details_harness|winogrande|5_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T21-07-48.403934.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_16T21_07_48.403934", "path": ["results_2024-01-16T21-07-48.403934.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T21-07-48.403934.parquet"]}]}]} | 2024-01-16T21:10:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of rombodawg/Leaderboard-killer-MoE_4x7b
Dataset automatically created during the evaluation run of model rombodawg/Leaderboard-killer-MoE_4x7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
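A minimal loading sketch (the repo id and config name below are taken from this card's metadata; any of the 63 configs can be substituted for "harness_winogrande_5"):
```python
from datasets import load_dataset

# Details for a single task/config; "train" always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_rombodawg__Leaderboard-killer-MoE_4x7b",
    "harness_winogrande_5",
    split="train",
)
```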
## Latest results
These are the latest results from run 2024-01-16T21:07:48.403934 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of rombodawg/Leaderboard-killer-MoE_4x7b\n\n\n\nDataset automatically created during the evaluation run of model rombodawg/Leaderboard-killer-MoE_4x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T21:07:48.403934(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of rombodawg/Leaderboard-killer-MoE_4x7b\n\n\n\nDataset automatically created during the evaluation run of model rombodawg/Leaderboard-killer-MoE_4x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T21:07:48.403934(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
62c2be598f4ec6a19e7421f184a42ae618e2e739 |
This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information.
https://huggingface.co/datasets/biglab/webui-70k
```
from datasets import load_dataset
dataset = load_dataset("biglab/webui-70k-elements")
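# Quick sanity check -- field names follow the dataset_info schema in this card's
# metadata (image, labels, contentBoxes, ...); treat this access pattern as a sketch.
example = dataset["train"][0]
print(example["key_name"], example["labels"][0], example["contentBoxes"][0])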
``` | biglab/webui-70k-elements | [
"region:us"
] | 2024-01-16T21:23:06+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "labels", "sequence": {"sequence": "string"}}, {"name": "contentBoxes", "sequence": {"sequence": "float64"}}, {"name": "paddingBoxes", "sequence": {"sequence": "float64"}}, {"name": "borderBoxes", "sequence": {"sequence": "float64"}}, {"name": "marginBoxes", "sequence": {"sequence": "float64"}}, {"name": "key_name", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12719410165.962, "num_examples": 173546}], "download_size": 11396715289, "dataset_size": 12719410165.962}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-23T02:36:07+00:00 | [] | [] | TAGS
#region-us
|
This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information.
URL
| [] | [
"TAGS\n#region-us \n"
] |
25d7c8dacd80c42eed0d5b20833a831da02975b1 |
# Dataset Card for Evaluation run of LoSboccacc/orthogonal-2x7B-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LoSboccacc/orthogonal-2x7B-base](https://huggingface.co/LoSboccacc/orthogonal-2x7B-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base",
"harness_winogrande_5",
split="train")
```
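The aggregated scores referenced above live in the "results" configuration; a short sketch (the "results" config and "latest" split names follow the convention used by these leaderboard detail datasets — treat them as assumptions for this particular repo):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base",
    "results",
    split="latest",
)
```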
## Latest results
These are the [latest results from run 2024-01-16T21:21:27.618218](https://huggingface.co/datasets/open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base/blob/main/results_2024-01-16T21-21-27.618218.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6260063657620419,
"acc_stderr": 0.03286232914053458,
"acc_norm": 0.6295442881937665,
"acc_norm_stderr": 0.033515089160485206,
"mc1": 0.49326805385556916,
"mc1_stderr": 0.017501914492655386,
"mc2": 0.6600496157622183,
"mc2_stderr": 0.015282722255268989
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349812,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.013752062419817836
},
"harness|hellaswag|10": {
"acc": 0.6698864767974507,
"acc_stderr": 0.004692926794268465,
"acc_norm": 0.8554072893845848,
"acc_norm_stderr": 0.0035097096477918416
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.02486499515976775,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.02486499515976775
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612893,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.035208939510976534,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.035208939510976534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073318,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073318
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047707,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188943,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333334,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49326805385556916,
"mc1_stderr": 0.017501914492655386,
"mc2": 0.6600496157622183,
"mc2_stderr": 0.015282722255268989
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838232
},
"harness|gsm8k|5": {
"acc": 0.5079605761940864,
"acc_stderr": 0.013770739063135374
}
}
```
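If you only need the aggregated numbers shown above rather than the per-example details, the "results" configuration listed in this repo's metadata exposes them directly. A minimal sketch (the column layout is a flattened version of the JSON above, so treat the exact field names as an assumption):
```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base",
    "results",
    split="latest",
)
print(results.column_names)  # flattened aggregate metrics and run configuration
print(results[0])            # the latest aggregated run
```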
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base | [
"region:us"
] | 2024-01-16T21:23:43+00:00 | {"pretty_name": "Evaluation run of LoSboccacc/orthogonal-2x7B-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [LoSboccacc/orthogonal-2x7B-base](https://huggingface.co/LoSboccacc/orthogonal-2x7B-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T21:21:27.618218](https://huggingface.co/datasets/open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base/blob/main/results_2024-01-16T21-21-27.618218.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6260063657620419,\n \"acc_stderr\": 0.03286232914053458,\n \"acc_norm\": 0.6295442881937665,\n \"acc_norm_stderr\": 0.033515089160485206,\n \"mc1\": 0.49326805385556916,\n \"mc1_stderr\": 0.017501914492655386,\n \"mc2\": 0.6600496157622183,\n \"mc2_stderr\": 0.015282722255268989\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349812,\n \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817836\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6698864767974507,\n \"acc_stderr\": 0.004692926794268465,\n \"acc_norm\": 0.8554072893845848,\n \"acc_norm_stderr\": 0.0035097096477918416\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5974358974358974,\n \"acc_stderr\": 
0.02486499515976775,\n \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612893,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612893\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073318,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073318\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n 
\"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188943,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188943\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49326805385556916,\n \"mc1_stderr\": 0.017501914492655386,\n \"mc2\": 0.6600496157622183,\n \"mc2_stderr\": 0.015282722255268989\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838232\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5079605761940864,\n \"acc_stderr\": 0.013770739063135374\n }\n}\n```", "repo_url": "https://huggingface.co/LoSboccacc/orthogonal-2x7B-base", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|arc:challenge|25_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|gsm8k|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hellaswag|10_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-21-27.618218.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-21-27.618218.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-21-27.618218.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T21-21-27.618218.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-21-27.618218.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-21-27.618218.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["**/details_harness|winogrande|5_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T21-21-27.618218.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_16T21_21_27.618218", "path": ["results_2024-01-16T21-21-27.618218.parquet"]}, {"split": "latest", "path": 
["results_2024-01-16T21-21-27.618218.parquet"]}]}]} | 2024-01-16T21:24:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of LoSboccacc/orthogonal-2x7B-base
Dataset automatically created during the evaluation run of model LoSboccacc/orthogonal-2x7B-base on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
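(A minimal sketch; the `harness_winogrande_5` config and the `latest` split appear in this card's configuration list, while the details-repository id follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming and is an assumption here, not taken verbatim from this card.)

```python
from datasets import load_dataset

# Assumption: details repositories follow the leaderboard's usual naming scheme.
data = load_dataset(
    "open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base",
    "harness_winogrande_5",  # any of the 63 configurations listed in this card
    split="latest",          # or a timestamped split such as "2024_01_16T21_21_27.618218"
)
```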
## Latest results
These are the latest results from run 2024-01-16T21:21:27.618218 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of LoSboccacc/orthogonal-2x7B-base\n\n\n\nDataset automatically created during the evaluation run of model LoSboccacc/orthogonal-2x7B-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T21:21:27.618218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of LoSboccacc/orthogonal-2x7B-base\n\n\n\nDataset automatically created during the evaluation run of model LoSboccacc/orthogonal-2x7B-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-16T21:21:27.618218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7df0ffc4ffb2cbf8c282b9c5616de4b77877054d |
# Dataset of y (Pokémon)
This is the dataset of y (Pokémon), containing 15 images and their tags.
The core tags of this character are `blonde_hair, short_hair, bangs, blue_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:-----------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 6.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/y_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 5.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/y_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 22 | 8.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/y_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 6.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/y_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 22 | 9.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/y_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/y_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, sleeveless_shirt, looking_at_viewer, red_skirt, smile, solo, black_shirt, pleated_skirt, white_background, black_thighhighs, open_mouth, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | sleeveless_shirt | looking_at_viewer | red_skirt | smile | solo | black_shirt | pleated_skirt | white_background | black_thighhighs | open_mouth | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------------------|:------------|:--------|:-------|:--------------|:----------------|:-------------------|:-------------------|:-------------|:--------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/y_pokemon | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-16T21:25:09+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-16T21:30:38+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of y (Pokémon)
======================
This is the dataset of y (Pokémon), containing 15 images and their tags.
The core tags of this character are 'blonde\_hair, short\_hair, bangs, blue\_eyes, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
fc146754775083eee2ee80bea1741f43767d9b0d |
# Dataset of kirino_aya/桐野アヤ (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kirino_aya/桐野アヤ (THE iDOLM@STER: Cinderella Girls), containing 24 images and their tags.
The core tags of this character are `black_hair, long_hair, brown_eyes, earrings, single_hair_bun, hair_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 21.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirino_aya_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 16.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirino_aya_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 52 | 31.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirino_aya_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 19.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirino_aya_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 52 | 37.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirino_aya_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kirino_aya_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, jewelry, gloves, one_eye_closed, smile, breasts, card_(medium), character_name, dress, gem_(symbol) |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | jewelry | gloves | one_eye_closed | smile | breasts | card_(medium) | character_name | dress | gem_(symbol) |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:---------|:-----------------|:--------|:----------|:----------------|:-----------------|:--------|:---------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kirino_aya_idolmastercinderellagirls | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-16T21:42:58+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-16T21:48:52+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of kirino\_aya/桐野アヤ (THE iDOLM@STER: Cinderella Girls)
==============================================================
This is the dataset of kirino\_aya/桐野アヤ (THE iDOLM@STER: Cinderella Girls), containing 24 images and their tags.
The core tags of this character are 'black\_hair, long\_hair, brown\_eyes, earrings, single\_hair\_bun, hair\_bun', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
afa320b9cfb47a27f4d7f1f046c43e3b3ae21bd8 |
This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information.
https://huggingface.co/datasets/biglab/webui-val
```python
from datasets import load_dataset
dataset = load_dataset("biglab/webui-val-elements")
```
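Each record carries the annotation columns listed in this card's feature list (`image`, `labels`, `contentBoxes`, `paddingBoxes`, `borderBoxes`, `marginBoxes`, `key_name`). A hedged sketch of reading one record, assuming those field names; the exact nesting of the label and box sequences should be checked on a real example:

```python
# Illustrative only: field names come from the feature list above.
example = dataset["train"][0]      # the converted split is named "train"
print(example["key_name"])         # identifier of the source page
for labels, box in zip(example["labels"], example["contentBoxes"]):
    print(labels, box)             # per-element label list and content-box coordinates
```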
NOTE: this is the validation split of the WebUI dataset, even though in the converted version, the split is named "train" | biglab/webui-val-elements | [
"region:us"
] | 2024-01-16T21:45:04+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "labels", "sequence": {"sequence": "string"}}, {"name": "contentBoxes", "sequence": {"sequence": "float64"}}, {"name": "paddingBoxes", "sequence": {"sequence": "float64"}}, {"name": "borderBoxes", "sequence": {"sequence": "float64"}}, {"name": "marginBoxes", "sequence": {"sequence": "float64"}}, {"name": "key_name", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1641250089.776, "num_examples": 21168}], "download_size": 1381767281, "dataset_size": 1641250089.776}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-23T02:39:09+00:00 | [] | [] | TAGS
#region-us
|
This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information.
URL
NOTE: this is the validation split of the WebUI dataset, even though in the converted version, the split is named "train" | [] | [
"TAGS\n#region-us \n"
] |
4f3a3e097b5bbc7f680c153d344f072eabe85c4d |
# Dataset of opal/ポプラ (Pokémon)
This is the dataset of opal/ポプラ (Pokémon), containing 39 images and their tags.
The core tags of this character are `eyeshadow, white_hair, hat, short_hair, long_hair, purple_eyeshadow, black_hair, multicolored_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 39 | 22.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/opal_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 39 | 17.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/opal_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 23.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/opal_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 39 | 21.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/opal_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 28.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/opal_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/opal_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, makeup, old_woman, smile, closed_mouth, solo, dress, holding, looking_at_viewer, nail_polish, umbrella, bangle, purple_scarf, single_glove, feather_boa |
| 1 | 10 |  |  |  |  |  | male_focus, shirt, two-tone_hair, 1boy, bangs, makeup, green_eyes, white_jacket, closed_mouth, upper_body, cropped_jacket, gloves, hair_over_one_eye, holding, solo |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | makeup | old_woman | smile | closed_mouth | solo | dress | holding | looking_at_viewer | nail_polish | umbrella | bangle | purple_scarf | single_glove | feather_boa | male_focus | shirt | two-tone_hair | 1boy | bangs | green_eyes | white_jacket | upper_body | cropped_jacket | gloves | hair_over_one_eye |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:------------|:--------|:---------------|:-------|:--------|:----------|:--------------------|:--------------|:-----------|:---------|:---------------|:---------------|:--------------|:-------------|:--------|:----------------|:-------|:--------|:-------------|:---------------|:-------------|:-----------------|:---------|:--------------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | | X | | | X | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/opal_pokemon | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-16T21:48:08+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-16T22:00:50+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of opal/ポプラ (Pokémon)
=============================
This is the dataset of opal/ポプラ (Pokémon), containing 39 images and their tags.
The core tags of this character are 'eyeshadow, white\_hair, hat, short\_hair, long\_hair, purple\_eyeshadow, black\_hair, multicolored\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
fc3618d6565e2155cda30e098694aa4d7c2ef3e5 |
# Material Contracts (Exhibit 10) from SEC 10-K, 8-K, and 10-Q filings from 1994 to 2024.
WIP. I started crawling on Jan 14, 2024, and expect to finish in a month or two. I will upload the data in batches.
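The batches uploaded so far can be loaded like any other Hub dataset; a minimal sketch (column names follow the feature list declared for this repo):

```python
from datasets import load_dataset

# Loads whatever batches have been uploaded so far.
ds = load_dataset("chenghao/sec-material-contracts", split="train")

# Each row describes one Exhibit 10 filing; "file_content" holds the contract text/HTML.
row = ds[0]
print(row["name"], row["type"], row["date"], row["doc_type"])
print(row["file_content"][:500])
```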
| chenghao/sec-material-contracts | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"license:cc-by-nc-sa-4.0",
"legal",
"finance",
"region:us"
] | 2024-01-16T21:51:13+00:00 | {"language": ["en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "pretty_name": "SEC Material Contracts (Exhibit 10)", "tags": ["legal", "finance"], "dataset_info": {"features": [{"name": "index_html_url", "dtype": "string"}, {"name": "index_text_url", "dtype": "string"}, {"name": "cik", "dtype": "int64"}, {"name": "name", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "date", "dtype": "timestamp[ns]"}, {"name": "seq", "dtype": "int64"}, {"name": "desc", "dtype": "string"}, {"name": "doc_type", "dtype": "string"}, {"name": "size", "dtype": "int64"}, {"name": "filename", "dtype": "string"}, {"name": "file_url", "dtype": "string"}, {"name": "file", "dtype": "string"}, {"name": "file_content", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 158734794, "num_examples": 1000}], "download_size": 35156538, "dataset_size": 158734794}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-09T14:30:13+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-cc-by-nc-sa-4.0 #legal #finance #region-us
|
# Material Contracts (Exhibit 10) from SEC 10-K, 8-K, and 10-Q filings from 1994 to 2024.
WIP. I started the crawling on Jan 14, 2024, and expect to finish in a month or two. I will upload the data in batches.
| [
"# Material Contracts (Exhibit 10) from SEC 10-K, 8-K, and 10-Q filings from 1994 to 2024.\n\nWIP. I started the crawling on Jan 14, 2024, and expect to finish in a month or two. I will upload the data in batches."
] | [
"TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-cc-by-nc-sa-4.0 #legal #finance #region-us \n",
"# Material Contracts (Exhibit 10) from SEC 10-K, 8-K, and 10-Q filings from 1994 to 2024.\n\nWIP. I started the crawling on Jan 14, 2024, and expect to finish in a month or two. I will upload the data in batches."
] |
6712c45a6016835663c2eb28bedc024ee0bfb534 |
### Dataset Description
Legal Contracts Dataset for Training NER Model
This repository contains a specially curated dataset of legal contracts. It is designed for training a Named Entity Recognition (NER) model, with the aim of recognizing and classifying four types of entities in the text:
Contract Type,
Clause Title,
Clause Number,
Definition Title
The dataset includes a broad variety of legal contracts, covering diverse domains such as employment, real estate, services, sale, lease, etc.
Entities in the text have been manually labeled by experts in the field, ensuring high-quality training data for the model.
Each document in the dataset has been annotated in the following format:
(Start_Position, End_Position, Entity_Label)
For example, a clause title may be annotated as follows: (102, 115, 'clause title')
This will assist the NER model in identifying not only the text of the entity, but also its position within the document.
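As an illustration of that span format, a small hand-made example is shown below; the exact label strings and the way annotations are stored in this repository are assumptions here, not taken from the data files.

```python
# Hypothetical record in the (start, end, label) span format described above.
text = "This Employment Agreement ... 2. Term of Employment ..."
annotations = [
    (5, 25, "contract type"),   # "Employment Agreement"
    (30, 31, "clause number"),  # "2"
    (33, 51, "clause title"),   # "Term of Employment"
]
for start, end, label in annotations:
    print(label, "->", text[start:end])
```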
Usage Guidelines
| lawinsider/uk_ner_contract | [
"task_categories:token-classification",
"task_ids:named-entity-recognition",
"language:uk",
"region:us"
] | 2024-01-16T21:51:59+00:00 | {"language": ["uk"], "task_categories": ["token-classification"], "task_ids": ["named-entity-recognition"], "pretty_name": "UK-NER-contracts"} | 2024-01-16T23:57:00+00:00 | [] | [
"uk"
] | TAGS
#task_categories-token-classification #task_ids-named-entity-recognition #language-Ukrainian #region-us
|
### Dataset Description
Legal Contracts Dataset for Training NER Model
This repository contains a specially curated dataset consisting of legal contracts. It is designed for the purpose of training a Named Entity Recognition (NER) model, with the aim to recognize and classify four types of entities in the text:
Contract Type,
Clause Title,
Clause Number,
Definition Title
The dataset includes a broad variety of legal contracts, covering diverse domains such as employment, real estate, services, sale, lease, etc.
Entities in the text have been manually labeled by experts in the field, ensuring high-quality training data for the model.
Each document in the dataset has been annotated in the following format:
(Start_Position, End_Position, Entity_Label)
For example, a clause title may be annotated as follows: (102, 115, 'clause title')
This will assist the NER model in identifying not only the text of the entity, but also its position within the document.
Usage Guidelines
| [
"### Dataset Description\n\nLegal Contracts Dataset for Training NER Model\nThis repository contains a specially curated dataset consisting of legal contracts. It is designed for the purpose of training a Named Entity Recognition (NER) model, with the aim to recognize and classify four types of entities in the text:\n\nContract Type,\nClause Title,\nClause Number,\nDefinition Title\n\n\nThe dataset includes a broad variety of legal contracts, covering diverse domains such as employment, real estate, services, sale, lease, etc.\n\nEntities in the text have been manually labeled by experts in the field, ensuring high-quality training data for the model.\n\nEach document in the dataset has been annotated in the following format:\n\n(Start_Position, End_Position, Entity_Label)\n\nFor example, a clause title may be annotated as follows: (102, 115, 'clause title')\n\nThis will assist the NER model in identifying not only the text of the entity, but also its position within the document.\n\nUsage Guidelines"
] | [
"TAGS\n#task_categories-token-classification #task_ids-named-entity-recognition #language-Ukrainian #region-us \n",
"### Dataset Description\n\nLegal Contracts Dataset for Training NER Model\nThis repository contains a specially curated dataset consisting of legal contracts. It is designed for the purpose of training a Named Entity Recognition (NER) model, with the aim to recognize and classify four types of entities in the text:\n\nContract Type,\nClause Title,\nClause Number,\nDefinition Title\n\n\nThe dataset includes a broad variety of legal contracts, covering diverse domains such as employment, real estate, services, sale, lease, etc.\n\nEntities in the text have been manually labeled by experts in the field, ensuring high-quality training data for the model.\n\nEach document in the dataset has been annotated in the following format:\n\n(Start_Position, End_Position, Entity_Label)\n\nFor example, a clause title may be annotated as follows: (102, 115, 'clause title')\n\nThis will assist the NER model in identifying not only the text of the entity, but also its position within the document.\n\nUsage Guidelines"
] |
22c0963526694423cb3bbd76a8010e63d4a6af9f |
# Dataset of cathy_graham/キャシー・グラハム (THE iDOLM@STER: Cinderella Girls)
This is the dataset of cathy_graham/キャシー・グラハム (THE iDOLM@STER: Cinderella Girls), containing 20 images and their tags.
The core tags of this character are `short_hair, earrings, brown_hair, blue_eyes, blonde_hair, thick_eyebrows, ahoge, aqua_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 14.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cathy_graham_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 12.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cathy_graham_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 33 | 18.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cathy_graham_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 14.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cathy_graham_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 33 | 21.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cathy_graham_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cathy_graham_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, smile, solo, jewelry, card_(medium), character_name, hair_ornament, one_eye_closed, open_mouth, sun_symbol, thighhighs, dress, flower, hat, orange_background, skates, skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | jewelry | card_(medium) | character_name | hair_ornament | one_eye_closed | open_mouth | sun_symbol | thighhighs | dress | flower | hat | orange_background | skates | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:----------|:----------------|:-----------------|:----------------|:-----------------|:-------------|:-------------|:-------------|:--------|:---------|:------|:--------------------|:---------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/cathy_graham_idolmastercinderellagirls | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-16T21:54:23+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-16T21:59:28+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of cathy\_graham/キャシー・グラハム (THE iDOLM@STER: Cinderella Girls)
=====================================================================
This is the dataset of cathy\_graham/キャシー・グラハム (THE iDOLM@STER: Cinderella Girls), containing 20 images and their tags.
The core tags of this character are 'short\_hair, earrings, brown\_hair, blue\_eyes, blonde\_hair, thick\_eyebrows, ahoge, aqua\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |